Disrupting Project Controls – Fast Forward 20 Years


By Kevin Mattheys


Sitting in the OTC office in Wellington, Western Cape (near Cape Town, South Africa), I was looking out of the window at the welcome rain falling gently to the ground, providing the trees and vegetation some much-needed sustenance.  I was also thinking about how the world has changed over my lifetime.  It was then that my weekly issue of MindBullets popped up in my email inbox.  MindBullets is a free weekly publication from Future World that tries to predict future events based on current knowledge and technology.

This MindBullets article discussed ‘disruption’, the rate at which it is happening and some of the technologies driving it.  While reading the article, I thought it prudent to write this piece on where the project controls industry will find itself 20 years from now, as a ‘conversation starter’.  Twenty years might sound like a long time, but change could also happen sooner, so I have been somewhat conservative.  I suspect not much work has yet been done on the topic, as there still appears to be a proliferation of spreadsheets and in-the-drawer databases in use.

This phase in the ongoing advancement of humanity is what is commonly referred to as the ‘4th Industrial Revolution’.  Some of the questions that will need to be asked are:

  • Can machines and/or technology help us implement projects better?
  • Will project controls even exist in future and, if so, in what shape or form?
  • How much automation will take place?
  • What skills will be required to get the best from technology?

These are some of the questions that will need to be tackled in almost all organisations involved in projects and project controls activities.  There will no doubt be advantages and disadvantages, but will the advantages outweigh the disadvantages?

I guess the question can be asked for project management and other disciplines as well, but for now let us focus on project controls.


Innovations in technology and technology-related applications continue to proliferate at an astonishing pace.  Moore’s law certainly seems to be playing out in this field.  What was considered revolutionary a few years ago has been overtaken and is now outdated.

I have been receiving the weekly articles from Future World since 2011.  What piqued my interest in 2014 was their prediction for a concept called 3D printing (sometimes referred to as additive manufacturing), and ever since then I have been intrigued by this technology.  Of course, this is not the only technology causing disruption, but the advances made in this field in the past four to five years have been staggering.

By implication, one can probably assume that similar strides are being made in other technologies.  Let us look at each of the technologies out there that I am aware of and dissect them a little.  They are not addressed in any particular order but provide a sample to help the reader better understand what is happening behind the scenes, so that these technologies can be factored into the conversations.

Current Disruptive Technologies

3D Printing (additive manufacturing)

Essentially, this technology uses filaments of various types, e.g. plastic, metal, nylon, wood filler, carbon fibre, etc.  The filaments are passed through a heated nozzle, which melts the filament into a semi-liquid state that is then applied, layer by layer, to the object being produced.

Houses have been produced in some areas of the world using 3D printing (Byttner, 2016).  In Holland, the architectural firm Dus Architects has already printed a ‘Canal House’ with 3D-printers.  Another example is that the Chinese company Yingchuang New Materials in Shanghai is already 3D-printing 10 houses per day.  Recently, a 3D-printed office building was unveiled in Dubai.

Unconfirmed reports state that NASA no longer sends spare parts with their space ships.  If needed, they can simply print a spare part in space. Other applications have already been developed in the medical, manufacturing, healthcare, optics, education and food industries.


Drones

Drones are a relative newcomer to the scene.  To the American military, they are UAVs (Unmanned Aerial Vehicles) or RPAS (Remotely Piloted Aerial Systems), but they are more commonly known as drones.  Drones are used in situations where manned flight is considered too risky or difficult.

Drones are currently being used in agriculture, recording of live events, surveying of dangerous areas, delivery of small items, tracking of wildlife, law enforcement and the shooting of commercials and movies.  In agriculture, they are used to monitor crops, watering patterns and soil suitability.  This is an area that will continue to grow as more and more creative uses are found for these machines.

As a practical application, why not use drones to view a project being built, with the images sent back to a central location where further analytics, project progressing, etc. can be done without having to go to site?  Furthermore, if drone cameras are fitted with appropriate lenses, then welds can be analysed for cracks, for example, or hot spots can be detected.  The opportunities go on and on.

Artificial Intelligence (AI)

Artificial intelligence (AI) is intelligence demonstrated by machines, in contrast to the natural intelligence displayed by humans and other animals.  AI strives to replicate human thinking and analysis, but via the machine.  Apart from powering chess-playing machines, AI is currently used in voice and speech recognition, medical diagnosis and search engines.

If recent predictions are to be believed, then this is a huge growth area.  According to a new market research report, Artificial Intelligence (AI) in Construction Market – Global Forecast to 2023 (MarketsandMarkets™, 2018), the global market is expected to grow from US$ 407.2 Million in 2018 to US$ 1,831.0 Million by 2023, at a compound annual growth rate of 35.1% during the forecast period.
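Those growth figures are easy to sanity-check with a few lines of arithmetic; the sketch below simply compounds the quoted 2018 figure at the quoted CAGR:

```python
# Verify the quoted forecast: US$407.2 million in 2018 growing at a
# 35.1% compound annual growth rate (CAGR) over the 2018-2023 period.
start_value_musd = 407.2   # 2018 market size, US$ million (as quoted)
cagr = 0.351               # compound annual growth rate (as quoted)
years = 2023 - 2018        # five-year forecast period

end_value_musd = start_value_musd * (1 + cagr) ** years
print(f"Forecast 2023 market size: US$ {end_value_musd:,.1f} million")
```

This lands within a couple of million of the quoted US$1,831.0 million, with the small difference attributable to rounding of the published CAGR.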

Data Analytics

Data analysis is a process of inspecting, cleansing, transforming, and modelling data with the goal of discovering useful information, suggesting conclusions, and supporting decision-making.  In the past, large companies used relational databases to extract information for decision-making, but this was somewhat limited.  With the vast amounts of data now available both inside and outside the organisation, it has become necessary to analyse this information more quickly and reliably.
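As a small illustration of that inspect–cleanse–transform–model cycle, consider the following Python sketch; the cost records and field names are purely hypothetical:

```python
# A minimal, hypothetical sketch of the inspect -> cleanse -> transform -> model
# cycle applied to project cost records (all field names are illustrative).
raw_records = [
    {"wbs": "1.1", "budget": "120000", "actual": "135500"},
    {"wbs": "1.2", "budget": "80000",  "actual": None},   # missing actual cost
    {"wbs": "1.2", "budget": "80000",  "actual": None},   # duplicate row
    {"wbs": "1.3", "budget": "45000",  "actual": "43250"},
]

# Cleanse: drop duplicates and records with missing actuals; coerce types.
seen, clean = set(), []
for rec in raw_records:
    if rec["wbs"] in seen or rec["actual"] is None:
        continue
    seen.add(rec["wbs"])
    clean.append({"wbs": rec["wbs"],
                  "budget": float(rec["budget"]),
                  "actual": float(rec["actual"])})

# Transform/model: derive a cost-variance figure per WBS element.
for rec in clean:
    rec["variance"] = rec["budget"] - rec["actual"]
    print(f"{rec['wbs']}: variance {rec['variance']:+,.0f}")
```

In practice this kind of work is done with dedicated analytics tooling rather than hand-rolled loops, but the underlying cycle is the same.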

Cloud Data Storage

Cloud storage allows world-wide storage and retrieval of any amount of data at any time.  You can use cloud storage for a range of scenarios, including serving website content, storing data for archival and disaster recovery, or distributing large data objects to users via direct download.  I think this is also an area that is going to improve greatly.

Most of our work at OTC is based on working in the cloud and the benefits are not difficult to see.

Internet of Things (IoT)

The Internet of Things is the network of physical devices, vehicles, home appliances and other items embedded with electronics, software, sensors, actuators, and connectivity which enables these objects to connect and exchange data.

We already see widespread application of this technology, but what will the future hold for project controls practitioners?  Working from home, always available, data at the touch of a button.


Blockchain

Blockchain refers to a type of data structure that enables identifying and tracking transactions digitally and sharing this information across a distributed network of computers, creating in a sense a distributed trust network.  The distributed ledger technology offered by blockchain provides a transparent and secure means for tracking the ownership and transfer of assets.

What will this technology bring to the procurement processes of companies?  It will almost certainly ensure the integrity of materials and equipment delivered to site.


Cryptocurrencies

Bitcoin, Ethereum, EOS, Ripple, Litecoin, Bitcoin Cash, Binance Coin, IOTA, TRON and NEO.  Do you recognize these terms? If you do not, then I suggest you find out what they are. A good place to start is the article, Bitcoin for beginners (Mayer, 2017).

According to Wikipedia (2018), a cryptocurrency is a digital asset designed to work as a medium of exchange that uses strong cryptography to secure financial transactions, control the creation of additional units, and verify the transfer of assets.  A cryptocurrency is a kind of digital currency, virtual currency or alternative currency. Cryptocurrencies use decentralised control as opposed to centralised electronic money and central banking systems. Who knows how many other ‘currencies’ will pop up in future to challenge the current crop of cryptocurrencies.

Now the questions get interesting.  What does this mean for projects and companies that use traditional payment, financing and procurement currencies?  Can cost controllers and estimators control and estimate costs with this technology?  How will financial reports be prepared and presented?  Remember, with this technology many traditional transaction fees, interest costs and other costs associated with lending institutions could be rendered ‘free’.

Facial / Object Recognition

A facial recognition system is a technology capable of identifying or verifying a person from a digital image or a video frame from a video source.  There are multiple methods in which facial recognition systems work, but in general, they work by comparing selected facial features from a given image with faces within a database.

John Holland is an Australian construction company that is actively embracing technology.  In the field of safety, it uses facial recognition technology to identify workers who are not wearing the appropriate personal protective equipment (PPE) on site (McLean, 2018).

Other disruptive technologies

Other disruptive technologies which are out there, and which we have not even touched on, include clean energy, self-driving vehicles and biotechnology.  It seems as if ‘disruption’ is the new way of the world: the more of it, the better, it seems.

I sometimes wonder about whether it is always a good thing…

Generic Skills for the Future

The generic skills listed below will remain essential in the future work environment:

  • Complex Problem Solving: The skill to see relationships between industries and craft creative solutions to problems that are yet to appear;
  • Critical Thinking: People who can turn data into insightful interpretations will be sought after due to the complexity and interconnectedness of fields such as computer science, engineering and biology;
  • Creativity: The ability to build something out of ideas is a skill that will pay off now and in future;
  • People Management: Robots may acquire analytical and mathematical skills, but they cannot replace humans in leadership and managerial roles that require people skills;
  • Coordinating with others: Effective communication and team collaboration will be a top demand in any company;
  • Emotional Intelligence: Qualities such as empathy and curiosity will be essential for future managers;
  • Judgement and Decision-Making: The ability to condense vast amounts of data with the help of data analytics into insightful interpretations and measured decisions;
  • Service Orientation: People who know the importance of offering value to clients in the form of services and assistance;
  • Negotiation skills: Deriving win-win situations with businesses and individuals will be extremely important; and
  • Cognitive Flexibility: The ability to switch between different personas as the situation demands.

Do your corporate training programs address these areas?  Will you need to up-skill yourself?

Figure 1:   The future of project controls

My View on Project Controls in 20 Years

I am of the firm opinion that all the above technologies, as well as others which will still be developed in the future, will in some way, shape or form impact on project controls.

Data Analytics (or what is sometimes referred to as ‘Big Data’) will play a large role in project controls enabled by cloud computing. Modelling of plants or facilities will become the norm, and everything will be planned and modelled to the finest detail before any work is done on site.  Schedules will be automatically produced by the modelling software along with cost estimates, etc. from standardised templates.

Imagine a world where statistical simulations (possibly Monte Carlo or some other disruptive application) are the norm for cost estimates and schedules, and live data is used to track and monitor progress related to engineering, procurement, construction, etc.  This data is then used to provide an almost real-time statistical forecast of the end-of-job cost and schedule, which means a narrowing of the conventional distribution curves of today.  The data is then fed back automatically into the respective data engines, leading to significant productivity and forecasting gains on future projects.  What, then, are the implications for a company managing multiple projects concurrently, with all this data available in real time?  Astonishing, to say the least.
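To make the idea concrete, here is a minimal Python sketch of a Monte Carlo schedule simulation; the three activities, their triangular duration estimates and the trial count are all illustrative assumptions:

```python
import random

# Monte Carlo sketch: sample uncertain activity durations many times and
# look at the spread of finish dates. All figures below are illustrative.
random.seed(42)

# (optimistic, most likely, pessimistic) durations in days for three
# sequential activities - say engineering, procurement and construction.
activities = [(20, 30, 50), (15, 25, 45), (40, 60, 100)]

trials = 10_000
finishes = sorted(
    sum(random.triangular(lo, hi, ml) for lo, ml, hi in activities)
    for _ in range(trials)
)

# Percentile finish dates: P50 is the median outcome; P80 is the duration
# the project finishes within in 80% of the simulated trials.
p50 = finishes[int(trials * 0.50)]
p80 = finishes[int(trials * 0.80)]
print(f"P50 finish: {p50:.0f} days, P80 finish: {p80:.0f} days")
```

Feeding live progress data back into the duration distributions is what would turn a static exercise like this into the near-real-time forecast described above.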

There will be fundamental change in procurement (blockchain) and construction practices (3D printing, and recognition software for permitting, site access, etc.).  Collection of as-built drawings will no longer be an issue, as all drawings will be attached to their correct repositories and updated online for immediate storage back into the database.

Project controls in its current format will change and may even merge with the project management function or just become a data engine feed, but with analytical capabilities. The availability of real time data will increase the project controllers’ productivity enormously as the required data will be online and not in some obscure place where it is difficult to find. Collecting, updating and storing data will continue to ensure decisions can be made easier and faster.  Transparency of data will be enhanced.  No more in-the-drawer spreadsheets or databases.  No more hiding of data.  No more manipulating of data and reports to play to someone’s agenda.

Gavin Halse, currently a consulting partner at OTC, discusses the impact that product life-cycle management (PLM) may have in an interesting article on the subject.  In the world of future projects, he foresees a shift from traditional megaprojects towards smaller, more agile projects involving many more stakeholders/participants in a new networked economy (Halse, 2017).  This will have a significant impact on project controls because project complexity will increase exponentially.  The long waterfall of design, build, commission, hand over to the operator, and operate will change.

Multi-purpose factories will adapt product lines to toll-manufacture on demand, and many projects will be to retrofit plants to produce new products – akin to stay-in-business work, but on a much bigger scale.  The owner’s role will also change, as will the operator’s.  Project controls and business modelling/operations will therefore converge much more and no longer operate in separate silos.

I think a new challenge will emerge from all of this in the commercial/legal fields, as I do not believe we have thought this through, e.g. how do we write and manage contracts for a digital age with so much disruption going on?

Concluding remarks

What was very interesting for me while preparing this article was the lack of articles published on the world wide web relating to the use of technology in the project controls or project management arenas.  It may mean that people are actively working on this in the background, but I doubt it.  Another interesting takeaway is a comment by Byttner (2016) that the construction industry has been somewhat reluctant to be dragged into the digital age and that it is now ready for disruption.

In this article I have not delved into huge detail, but rather tried to make it light reading to stimulate the thinking and the conversations that will be necessary in preparing for disruption.  It should serve as a starting point for tackling the role that will be required of project controls people (and others) in the future.

Whatever your view of the future of project controls is, be assured it will be different.  I would like to convey my thanks to John Hollman and Gavin Halse for providing further insight into this complex topic.


Develop and Analyse Project Schedules for Realism – Part 2


By Kevin Mattheys

This is the second part of a 2-part series of articles covering the development and control of robust, but realistic, project schedules.  The focus of the articles is as follows:

  • Part 1 – A composite scheduling process (published November 2016); and
  • Part 2 – Assessing schedules for realism.

This second part is an overview of schedule assessment metrics that can be used to ensure that project schedules are realistic.


A common management phrase states that “if you want to manage it, then measure it”.  To this end, various metrics have been developed and used by organisations to assess the quality of schedules and thereby improve project performance.  Some of the better known are:

  • The Defense Contract Management Agency (DCMA), which has developed criteria for performing an objective and thorough analysis of a schedule;
  • The National Defense Industrial Association (NDIA), in its Planning and Scheduling Excellence Guide (PASEG) (2016), describes, in addition to the DCMA criteria, a set of generally accepted scheduling principles (GASP); and
  • The US Government Accountability Office (GAO) (2015) with its guide, the GAO Schedule Assessment Guide: Best Practices for Project Schedules.

While the guides mentioned above are all written with a focus on government projects, I have not found anything in them that is not applicable to projects in general.  In fact, the United States defence establishment has often been at the leading edge of developing procedures, standards and other tools that later prove useful to industry in general.  In this article, we’ll consider each of these guides in turn, and then try to identify areas of difference that can be useful in putting an enhanced or improved schedule quality assessment system in place.


According to Winter (2011), the USA Under Secretary of Defense for Acquisition and Technology mandated the use of an Integrated Master Schedule (IMS) for contracts greater than US$20 million in March 2005. He also directed the DCMA to establish guidelines and procedures to monitor and evaluate these schedules. The DCMA then internally produced a programme in response to this requirement and released their 14-Point assessment checks as a framework for schedule quality control.

However, according to the National Defense Industrial Association (NDIA), the generally accepted scheduling principles (GASP) were originally developed as a governance mechanism for the Program Planning and Scheduling Subcommittee (PPSS). The PPSS is a subcommittee formed by the Industrial Committee on Program Management (ICPM) working group under the auspices of NDIA. The GASP was thus developed collaboratively with inputs from both Government and Industry.

I think that in the context of this article, the important issue is that there is a GASP, irrespective of who developed it.  It is also essential to understand that the GASP is intentionally broad and sets high expectations for excellent scheduling, yet does not specify methodologies.  Therefore, avoid viewing the GASP as dogma; instead, continually strive to meet or exceed the GASP with creative and practical approaches that suit the size, value, risk and complexity of the programme and the skills and capabilities of the programme team.

The DCMA Assessment Checks and the Generally Accepted Scheduling Principles (GASP)

The Defense Contract Management Agency’s schedule assessment system helps in determining schedule consistency, allows for constructive discussions based on the analysis, facilitates the setting of realistic schedule baselines, is based on proven metrics and provides two tripwires for early detection of possible deviations from the standard.

The generally accepted scheduling principles (GASP) are eight over-arching tenets for building, maintaining, and using schedules as effective management tools. The GASP is concise and easily understood, yet sets high expectations for programme or project management teams to develop and use schedules.

The first five GASP tenets describe the requisite qualities of a valid schedule; that is, one that provides complete, reasonable, and credible schedule information based on realistic logic, durations, and dates. The latter three GASP tenets reflect increased scheduling maturity that yields an effective schedule. An effective schedule provides timeous and reliable data, aligns time-phased resources, and is built and maintained using controlled and repeatable processes. Figure 1 below shows the groupings.


Figure 1:   The GASP tenet groupings

The GASP tenets serve several purposes, including that:

  • They provide high level targets for sound scheduling;
  • They also serve as a validation tool for the programme team or organisation to assess the schedule maturity or areas needing improvement;
  • They can be used as a governance tool to assess new or different scheduling approaches with objectivity and detachment; and
  • They can be used as a compliance standard in contracts for the accepted schedule.

Achieving a GASP-compliant schedule indicates it is not merely healthy, but fit. A healthy schedule is functional and meets minimum management purposes, but a fit schedule is robust and dynamic.

Let us now consider each of the 8 tenets and where the 14-point assessment checks fit in, starting with the “Valid” group of tenets:

  • Tenet 1 – Complete: Schedules must represent authorised discrete effort for the entire contract.
    • Assessment Point 1: High Duration – A high proportion of activities with this condition indicates a distinct lack of sufficiently detailed plans.
  • Tenet 2 – Traceable: Schedules reflect realistic and meaningful network logic that horizontally and vertically integrates the likely sequence for project execution. Coding plays a large part in this principle.
    • Assessment Point 2: Missing Logic – This metric is used to test the “completeness” of the schedule and see how well the activities in the schedule are linked together.
    • Assessment Point 3: Relationship Types – Finish-to-Start should be the most widely used relationship type; Start-to-Start and Finish-to-Finish should be used sparingly, and Start-to-Finish only by exception.
  • Tenet 3 – Transparent: Schedules provide full disclosure of project status and forecast and include documented ground rules, assumptions, schedule building and maintaining methods, approach to analysing critical paths, etc.
    • Assessment Point 4: Activities with Leads – The use of leads (negative lags) normally indicates that the schedule is not detailed enough or the scheduler is trying to “fix” activity dates. This makes it harder to analyse the critical path, distorts the total float in the schedule and may cause resource conflicts.
    • Assessment Point 5: Activities with Lags – The use of excessive lags complicates the schedule to the point of not being maintainable and ineffective in forecasting future dates. This makes it harder to analyse the critical path.
  • Tenet 4 – Statused: Schedule reflects consistent and regular updates of completed work, interim progress, achievable remaining durations relative to status date and accurately maintained logic relationships.
    • Assessment Point 6: Invalid Dates – Activities that have not started should not have projected start or finish dates before the status date. Activities should also not have actual start or finish dates after the status date.
    • Assessment Point 7: Missed Activities – This metric checks for the number of activities that have missed their baseline finish dates and are therefore not meeting the baseline plan.
    • Assessment Point 8: Baseline Execution Index (BEI) – Measures the number of activities completed as a percentage of the activities that should have been completed.
  • Tenet 5 – Predictive: Schedules accurately forecast the likely completion dates and impacts to the project baseline plan through valid network logic and achievable task durations from the status date to project completion.
    • Assessment Point 9: High Float – Excessive amounts of positive float may indicate incomplete schedule logic or an unstable network. Large negative float values may indicate a scheduler capturing the date incorrectly.
    • Assessment Point 10: Negative Float – Float less than 0 working days is considered negative float. This indicates that the project is overrunning its stated completion date and re-planning needs to take place to correct the situation.
    • Assessment Point 11: Critical Path Test – This test must be run manually. It is used to check the integrity of the critical path by ensuring that a delay introduced into the path has the appropriate impact on the end date.
    • Assessment Point 12: Critical Path Length Index (CPLI) – This is an indicator of the likelihood of completing the schedule on time and measures the relative efficiency required to complete the schedule on time.

The three tenets for the “Effective” group are:

  • Tenet 6 – Usable: Schedules provide meaningful metrics for timely and effective communication, tracking and improving performance, mitigating issues and risks as well as capturing opportunities.
    • Assessment Point 13: Hard constraints – The overuse of date constraints is considered one of the most common abuses of a schedule. Overuse leads to a “hardening” of the schedule and limits the forecasting capability of the schedule.
  • Tenet 7 – Resourced: Resources align with the schedule baseline and forecast to enable stakeholders to view and assess the time-phased labour and other costs required to achieve project baseline and forecast targets.
    • Assessment Point 14: No Assigned Resources – Without resources assigned it is extremely difficult to determine if there is a fighting chance of completing the activity / project. Resource profiles indicate resourcing needs and show if progress is being achieved / maintained at the rate planned for.
  • Tenet 8 – Controlled: Schedules are baselined and maintained using a rigorous, stable and repeatable process.  Schedule additions, deletions and updates conform to this process.
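Two of the assessment points above can be sketched in a few lines of Python. The toy activity data is purely illustrative, and the tripwire thresholds shown in the output (5% for missing logic, 0.95 for the BEI) are the values commonly cited for the DCMA checks:

```python
# Toy sketch of Assessment Point 2 (Missing Logic) and Assessment Point 8
# (Baseline Execution Index). The activity data is illustrative only.
activities = [
    # (id, predecessors, baseline_finish_has_passed, actually_finished)
    ("A", [],         True,  True),
    ("B", ["A"],      True,  True),
    ("C", ["B"],      True,  False),   # missed its baseline finish date
    ("D", [],         False, False),   # no predecessor: missing logic
    ("E", ["C", "D"], False, False),
]

# Missing Logic: activities without predecessors, excluding the true start
# milestone (taken here to be activity "A").
missing_logic = [a for a in activities if a[0] != "A" and not a[1]]
pct_missing = 100 * len(missing_logic) / len(activities)

# Baseline Execution Index: activities completed divided by activities
# that should have been completed by the status date.
planned = sum(1 for a in activities if a[2])
completed = sum(1 for a in activities if a[2] and a[3])
bei = completed / planned

print(f"Missing logic: {pct_missing:.0f}% (commonly cited tripwire: 5%)")
print(f"BEI: {bei:.2f} (values below about 0.95 warrant attention)")
```

Real schedule files carry far more detail (dates, float, constraints), but every one of the 14 checks reduces to a simple count or ratio of this kind, which is why the assessment is so easy to automate.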

The United States Government Accountability Office (GAO) Schedule Assessment Guide

GAO’s research tells us that the four characteristics of a high-quality, reliable schedule are that it is comprehensive, well-constructed, credible, and controlled.

A comprehensive schedule includes all activities for both the government (owner) and its contractors necessary to accomplish a program’s objectives as defined in the program’s WBS. The schedule includes the labour, materials, travel, facilities, equipment, and the like needed to do the work and depicts when those resources are needed and when they will be available. It realistically reflects how long each activity will take and allows for discrete progress measurement.

A schedule is well-constructed if all its activities are logically sequenced with the most straightforward logic possible. Unusual or complicated logic techniques are used judiciously and justified in the schedule documentation. The schedule’s critical path represents a true model of the activities that drive the program’s earliest completion date, and total float accurately depicts schedule flexibility.

A schedule is credible if it is horizontally traceable—that is, it reflects the order of events necessary to achieve aggregated products or outcomes. It is also vertically traceable: activities in varying levels of the schedule map to one another and key dates presented to management in periodic briefings are in sync with the schedule. Data about risks are used to predict a level of confidence in meeting the program’s completion date. Necessary schedule contingency and high-priority risks are identified by conducting a robust schedule risk analysis.

Finally, a schedule is controlled if trained schedulers update it regularly using actual progress and logic, based on information provided by activity owners, to realistically forecast dates for programme activities. Updates to the schedule are accompanied by a schedule narrative that describes salient changes to the network. The current schedule is compared against a designated baseline schedule to measure, monitor, and report the program’s progress. The baseline schedule is accompanied by a basis document that explains the overall approach to the program, defines ground rules and assumptions, and describes the unique features of the schedule. The baseline schedule and current schedule are subjected to configuration management control.

The GAO also indicates that there are 10 scheduling best practices required to achieve the four characteristics of a high-quality, reliable schedule.  They are self-explanatory and I will not delve into any of them here; suffice to say they are mostly represented and described in the DCMA literature discussed earlier.  Table 1 shows how the 10 scheduling best practices map to these four characteristics.

Table 1:   The GAO Schedule Characteristics and Best Practices


The United States National Defense Industrial Association (NDIA)

The NDIA published its Planning & Scheduling Excellence Guide (PASEG) in March 2016 to provide the programme management team, including new and experienced master planners/schedulers, with practical approaches for building, using, and maintaining an Integrated Master Schedule (IMS).  It also identifies knowledge, awareness and processes that enable the user to achieve reasonable consistency and a standardised approach to project planning, scheduling and analysis.

Sound schedules merge cost and technical data to influence programme management decisions and actions. Realistic schedules help stakeholders make key go-ahead decisions, track and assess past performance, and predict future performance and costs. Industry and Government agree that improving IMS integrity has a multiplier effect on improved programme management.

The three sections in the document that are of real interest to me are listed in Table 2.  This NDIA PASEG (available on the internet), can also serve as a useful guide for relatively new planners.

Table 2:   Three important sections of the NDIA PASEG


Key Observations

In both guides the GASP features prominently.  If we then adopt the GASP as the minimum schedule quality standard and do a comparative analysis with the GAO Schedule Assessment Guide and the NDIA Planning & Scheduling Excellence guide, the following can be observed:

Firstly, regarding the GAO Schedule Assessment Guide:

  • In this guide the GASP, by and large, is described with the same or similar words.
  • The guide seems to emphasise the planning process, rather than interrogating the schedule and analysing it with measurable results.
  • What I did like, and which will be included in the minimum standard, is best practice 1 “Capturing All Activities”. If one looks at the description, then it is very clear that the work breakdown structure (WBS) is central to this, as in theory “all activities need to link to a WBS element and all WBS elements need to link to an activity”.  This clearly then would need to be incorporated into tenet 1.

Secondly, regarding the NDIA Planning & Scheduling Excellence Guide:

  • This is an excellent and rather comprehensive guide to planning and scheduling. Certainly, value will be obtained by using this guide as a reference.
  • I love the fact that this guide focuses on the Integrated Master Schedule as its point of centrality for schedules. This is critical for large or complex projects or programmes. I would suggest that a measure to be added to GASP is whether all schedules linking to the Master Schedule have a minimum GASP quality measurement rating.
  • Another strength is the “Leadership, Buy-in and Commitment” section. This is an area that is generally very problematic: do approved roles and responsibilities exist for all parties involved in schedule development and maintenance, and how are these measured?  I also support introducing the Current Execution Index (CEI), which measures the team’s ability to forecast accurately.
  • This guide also emphasises Earned Value analysis, so one can add Schedule Performance to the mix. Cost Performance would be reported elsewhere, as the GASP is purely schedule related.
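The CEI mentioned above lends itself to a simple calculation. The sketch below is illustrative only and assumes one common formulation (completed tasks that were forecast to complete, divided by tasks forecast to complete); the exact definition should be confirmed against the PASEG itself:

```python
# Hedged sketch of the Current Execution Index (CEI) for one status period.
# Assumed formula (verify against the NDIA PASEG):
#   CEI = tasks completed that were forecast to complete
#         / tasks forecast to complete

def current_execution_index(forecast_to_complete: set[str],
                            actually_completed: set[str]) -> float:
    """Return the CEI for a single status period."""
    if not forecast_to_complete:
        # Nothing was forecast, so forecasting was trivially accurate.
        return 1.0
    hits = forecast_to_complete & actually_completed
    return len(hits) / len(forecast_to_complete)

# Example: 10 tasks forecast to finish this month; 7 of them actually finished.
forecast = {f"T{i}" for i in range(1, 11)}
done = {f"T{i}" for i in range(1, 8)} | {"T15"}  # T15 finished early, not counted
print(round(current_execution_index(forecast, done), 2))  # 0.7
```

A CEI trending well below 1.0 suggests the team's near-term forecasts are consistently optimistic, which is exactly the behaviour this measure is meant to expose.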

Concluding remarks

This has been a fascinating exercise, spurred on by Jurie Steyn, the editor of these articles, who challenged me to expand and improve on the generic minimum standard GASP discussed at the start of this article.  By identifying additional value-adding schedule measures, the quality of programme and project schedules improves, and hence the success of ventures improves.

In summary, four new measures will be added to the minimum standard GASP used within OTC.  This is by no means an all-inclusive list and we will continue searching for ways to improve on this, to the benefit of all.  If you have any suggestions after reading this article, please pass them on, and let us get the mindset entrenched in the industry that quality schedules are the new norm and the preferred standard.


DCMA (Defense Contract Management Agency), Date unknown, Generally Accepted Scheduling Practices (GASP).

GAO (United States Government Accountability Office), 2015, GAO-16-89G Schedule Assessment Guide.

NDIA (National Defense Industrial Association), 2016, Planning & Scheduling Excellence Guide (PASEG), Version 3.0.

Winter, R., 2011, DCMA 14-Point schedule assessment, Available as a PDF from http://www.ronwinterconsulting.com/DCMA_14-Point_Assessment.pdf.   Accessed on 30 November 2016.


You can download this and other Insight articles from our website at http://www.ownerteamconsult.com/download-insight-articles/

Develop and Analyse Project Schedules for Realism


By Kevin Mattheys

This is the first of a two-part series of articles covering the development and control of robust, but realistic, project schedules.   The focus of the articles is as follows:

The first part covers a project scheduling process which draws on the best processes available and consolidates them into a composite scheduling process.  The second part will address the quality assessment sub-process and the schedule basis document.


When presented with a project schedule which shows the correct durations and task dates, we generally assume the schedule is acceptable.  We then establish the project baseline against which the project will be progressed.  Not long after the project kicks off, things start to drift from plan and then it becomes an ongoing exercise to continually replan and rebaseline.  This situation gradually deteriorates and eventually leads to project delays, delay claims and, in some instances, lengthy and expensive court battles.  As is so often the case in life, the devil lies in the detail.

During my career, I have regularly encountered this phenomenon and must attest to many hours of frustration because one, or more, of the following requirements were not adhered to:

  • Basic rules of schedule development and control;
  • Minimum standards for what a quality schedule looks like;
  • Documented backup on how the schedule has been compiled;
  • Proper signoff and communication of the baseline schedule; and
  • A proper change management process.

It became evident that something was amiss in the underlying structure and processes of these schedules.  There appeared to be quality and governance issues at stake which, truth be told, are some of the main drivers of a realistic schedule.  This series of articles will highlight and suggest ways in which this situation can be improved.

Composite Scheduling Process

There have been numerous books written on scheduling; suffice to say that it is very important that a planner/scheduler follows a rigorous, formal planning process.  Examples of formal planning and scheduling processes can be found in the Project Management Body of Knowledge (PMI, 2013), literature and articles from AACE International (Crawford, 2011; Stephenson, 2015), and books such as Project management using earned value, 2nd edition, by Humphreys (2002).

There are certain nuances to each of these high-level processes which, if selectively stripped out and used to create a composite process, will more fully describe the process for creating a project schedule. Over time, my personal preference has swung towards the approach of Humphreys (2002), primarily because of its quality assessment sub-process, in conjunction with one or two modifications which address the fundamental frustrations highlighted in the introduction.

The composite schedule development and control process is highlighted in Table 1.  Note that the process described below is not a linear sequential process, but goes through a series of iterations to get to the eventual outcome.

Table 1:   The Schedule Development and Control Process


Let us now discuss each of these in a little more detail.  The focus of the balance of this article will only be on the schedule development and control process.  The second article will address the quality assessment sub-process and the schedule basis document.

The first step in the process is the preparation of the schedule so that we can establish the baseline which the project will be measured against.

Prepare the Schedule

Preparing the schedule comprises the following seven steps:

  • Develop the Network:  This sub-process is the start of schedule preparation.  As part of the inputs a set of target dates are required which could be startup related, market related or project completion related.  The list of schedule activities is derived from the scope which should be in the form of a Work Breakdown Structure (WBS), activity durations are determined and the relevant logic applied.  The various project execution strategies will also determine logic and grouping of activities so bear this in mind.   Once this has been completed, one has what is referred to as a network diagram.
  • Perform Critical Path Analysis:  To determine the critical path (the longest path through the network), the planning software performs a series of forward and backward passes through the network.  The resultant critical path is the sequence of activities such that, if any one of them were delayed, the project completion would be delayed.
  • Resource Loading:  It has been shown statistically that proper attention to resource loading leads to improved project outcomes and more reliable schedules.  The order of resource loading and critical path analysis is interchangeable, as some planners prefer to do the critical path analysis after the resource loading exercise has been undertaken.  I am personally not too fussed, so long as both actions occur iteratively until an optimum solution is achieved.
  • Crash the Network:  “Crashing” is a technique whereby various strategies related to overtime, additional manpower or sequencing of work are investigated to determine if there are opportunities to reduce the project duration.  This is a lengthy and complex process and usually leads to reduced project durations.  However, the additional costs need to be weighed against the improved outcomes.
  • Schedule Quality Assessment:  This is a sub-process that I believe is extremely underutilised, yet it greatly enhances the realism of the schedule.  Until recently, there did not appear to be much research on this aspect of the schedule.  In the absence of anything more substantive, this set of criteria can be used to analyse a schedule and determine fundamental problems which need to be corrected.  It is not a panacea for all schedule issues, but can be used as a minimum standard which can be added to over time.  More on this in next month’s follow-up article.
  • Risk Analysis:  This sub-process should not be undertaken until a proper quality assessment has been conducted.  Incorrect use of logic, relationships, constraints, float and long durations could have a detrimental effect on the outcomes of the risk analysis, and an unrealistic schedule will render the whole exercise invalid.  The object of a schedule risk analysis is to determine the possible impact on project durations should certain risks materialise or events occur.  After the analysis, a range of probabilistic dates is presented and the team decides on an appropriate level of risk.  This then becomes the schedule contingency (analogous to cost contingency) and should be managed in the same way.
  • Benchmark the Schedule:  Benchmarking appears to be an activity that takes place on an ad hoc basis.  When initially establishing the project schedule it is imperative to conduct a benchmarking exercise to determine the realism of the proposed project schedule before it becomes the de facto baseline.  Once the initial baseline has been established it may not be necessary to conduct a benchmarking exercise again unless there has been a significant de-scoping or scope increase.

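The forward and backward passes behind critical path analysis can be sketched in a few lines. The activities, durations and logic below are purely illustrative, and real scheduling tools handle calendars, lags and constraints that this minimal sketch ignores:

```python
# Minimal CPM sketch: a forward pass computes early finishes, a backward
# pass computes late finishes; activities with zero total float form the
# critical path. The network below is hypothetical.
from collections import defaultdict

durations = {"A": 3, "B": 2, "C": 4, "D": 2}
predecessors = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

# Forward pass: early finish = max(early finish of predecessors) + duration
early_finish = {}
def ef(task):
    if task not in early_finish:
        es = max((ef(p) for p in predecessors[task]), default=0)
        early_finish[task] = es + durations[task]
    return early_finish[task]

project_end = max(ef(t) for t in durations)

# Backward pass: late finish = min(late start of successors)
successors = defaultdict(list)
for t, preds in predecessors.items():
    for p in preds:
        successors[p].append(t)

late_finish = {}
def lf(task):
    if task not in late_finish:
        late_finish[task] = min((lf(s) - durations[s] for s in successors[task]),
                                default=project_end)
    return late_finish[task]

critical = [t for t in durations if lf(t) - ef(t) == 0]
print(project_end, critical)  # 9 ['A', 'C', 'D']
```

Here activity B carries two units of float, so delaying it slightly does not move the end date, whereas any delay on A, C or D pushes out project completion.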
Baseline the Schedule

Setting a Traceable Schedule:  On completion of the schedule preparation sub-processes, it is important to put the baseline in place.  This exercise should include all the elements needed to assess progress objectively.  Without an established and agreed baseline, the team will be flying blind once progress measurement starts.

The exact measures (the Earned Value criteria) need to be embedded in the schedule so that there is no debate about percent complete; ideally it becomes a simple question of whether an activity is complete or not.
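The yes/no nature of such measures can be illustrated with a minimal sketch of a 0/100 earning rule, where an activity earns its full budgeted value only once it is complete. The activity names and budget figures are hypothetical:

```python
# Hedged sketch of objective progress measurement using a 0/100 rule:
# an activity earns its full budgeted hours only when complete, so there
# is no debate over partial percent-complete. Values are illustrative.

activities = [
    {"name": "Pour foundations", "budget_hours": 400, "complete": True},
    {"name": "Erect steel",      "budget_hours": 600, "complete": True},
    {"name": "Install piping",   "budget_hours": 500, "complete": False},
]

total = sum(a["budget_hours"] for a in activities)
earned = sum(a["budget_hours"] for a in activities if a["complete"])
percent_complete = 100 * earned / total
print(f"{percent_complete:.1f}% complete")  # 66.7% complete
```

Other earning rules (50/50, milestone weighting, units complete) exist, but whichever rule is chosen should be fixed in the schedule before the baseline is set.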

  • Prepare and Sign Off the Schedule Basis Document:  Another area that perplexes me is the lack of preparation and/or upkeep of the schedule basis document.  This document describes the scope and explains the inner constructs of the project schedule: progressing criteria, assumptions, exclusions, etc.  Planners apart, not many people understand the inner workings of a schedule, so it is vital that this information is captured in an easy-to-read, understandable way. It is critical that this document is used to get all parties to sign off on the schedule before the baseline is set. A typical schedule basis document should contain the following:
    • The Plan – This should include a list and description of the activities with their planned dates (start and finish).  A complete scope of works should also be added;
    • Schedule control baseline – A time-phased, logically linked, resource-loaded, detailed interpretation of the plan, based on an aggregation of a common attribute of all activities to be measured and assessed;
    • Planned schedule – A list of activities with their planned date information usually illustrated as a bar chart;
    • Schedule basis – A description of the activities and resources covered, included methodologies, standards, references and defined deliverables, assumptions, inclusions and exclusions made, key milestones and constraints, calendars and some indication of the level of risk and uncertainty used to establish schedule contingency;
    • Schedule control plan – The progress weighting and measurement approach should be included.  It should include a description of how project performance should be measured and assessed in respect of the schedule including rules for earning progress and the procedures for evaluating progress and forecasting remaining durations.  This should all form part of the schedule basis document;
    • Quality Assessment Results – The results of the quality assessment should accompany the schedule which this assessment relates to; and
    • Signoff – Ensure signoff from all parties that this is the accepted schedule and baseline reflecting the scope of work as described earlier.

Monitor and Control the Schedule

  • Update the Schedule:  The regular sub-process of updating the schedule now commences with daily, weekly or monthly updates and reports against the baseline.
  • Schedule Change Management:  An area that does not get the attention it deserves is schedule change management.  Too often the focus is on cost, without understanding that a slippage in schedule may also have an impact on cost.  In addition, this is the only mechanism through which the project baseline can be changed.  The planner must make no changes to the schedule unless they are formally agreed and signed off via a formal change management process.
  • Schedule Risk Assessment:  Ongoing quarterly or six-monthly risk assessments should be conducted to test the realism of the schedule and its ability to accommodate potential risks.

Concluding remarks

In this article, I’ve highlighted some of the concerns I believe are hindering the development of realistic schedules.  I then identified four key areas (there may be more) which, if implemented properly, will lead to significantly improved realism in schedules.  These are:

  • a well-thought-through and consistent schedule development and control process;
  • a schedule quality assessment conducted frequently (GASP);
  • a basis document used for signoff; and
  • benchmarking.

Whilst many people will say this takes a lot of time, the effort is well worth it in the long run.  The effort is only expended in the initial setup and development; thereafter it is pure maintenance.


Crawford, T.X., 2011, Professional Practice Guide #4: Planning and scheduling, 3rd edition, AACE International.

Humphreys, G.C., 2002, Project Management using Earned Value, 2nd Edition, Humphreys and Associates, Inc.

PMI (Project Management Institute), 2013, A guide to the project management body of knowledge (PMBOK® Guide), 5th edition., Project Management Institute, Inc.

Stephenson, H.L., 2015, TCM Framework: An integrated approach to portfolio, program and project management, 2nd edition, AACE International.



How important is the Project Master Schedule?


In conversation with a couple of colleagues the other day, this simple question was raised.  Initially there was disbelief that the question could even arise.  After all, who uses a Project Master Schedule these days?  Responses ranged from:

  • “We analyse the detail to pick up issues and concerns and address them at the lowest level of the schedule”;
  • “We only produce one at the early stages of the project to give us a high-level idea of the project durations and then it gets put into someone’s drawer somewhere”;
  • “It’s hard work keeping it up to date”;
  • “We extract only activities from the month before and month after the status date to report on”.

I was rather taken aback at these responses as I have always used the Project Master Schedule on projects I have worked on to give me a quick overview of how the project is doing.  If I pick up an anomaly I then delve into the detail to find the problem that needs to be addressed.

So what is a Project Master Schedule?

To me, it is a one-page summary of the total project indicating the various phases and key milestones/deliverables, tying together Front-end Loading and Implementation (engineering, procurement, construction and commissioning) in such a way that it addresses the business need-by dates.  It is a schedule that reflects planned, actual, forecast and status dates.  Of course there can be variants to this, but in its simplest form that is it.

Key features of a master schedule include:

  • Simple to read, ideally one page only, used as a communication tool for senior management and the team;
  • Is a dynamic document for the duration of the project; and
  • The lowest level of the schedule and the highest level of the schedule must resonate with each other.

In summary, it is one of two critical documents I use to determine if we are in control of the project.  The other is the Earned Value graph, but more of that later.  What do you use on your projects?  Is it something similar?  Do you use one at all?  Please share your thoughts.  Remember, if you are a member you can download your own copy of our practical guide on how to implement a project master schedule from OTC Toolkits at www.otctoolkits.com

My Five Best Cost Engineering Practices


This is the second part of a two-part series of articles on cost engineering and project controls by Kevin Mattheys. The two parts are as follows:

  • Part 1 – What is Cost Engineering?
  • Part 2 – My five best Cost Engineering practices

In Part 1 I presented some background information and discussed how cost engineering differs from project controls. In Part 2, I share my five best cost engineering practices, based on many years’ experience.


Following on from last month’s article, where I espoused my views on Cost Engineering and where it fits in, I will now discuss five particular practices which, if addressed correctly and diligently, generally lead to improved outcomes on projects. This is by no means an exhaustive list; however, these are practices in which I have experienced both the positives and the negatives, and from which I have learnt enormously over the years.

The Database (or data model as some call it)

Crucial to the success of any endeavour is the ability to access information that is relevant, timeous and trustworthy and on which sound decisions can be made. An integrated database that is continuously updated, normalised with current project information, tested against international norms and standards and allows project teams to estimate, plan and control from early in idea development until final completion of the project is worth its weight in gold. This is reflected in Figure 1.

Crucial to the success of this continual data recycling is the use of a well-thought-out and structured set of ‘codes’, so that this myriad of information can be dissected and reported on efficiently and in a meaningful way. These ‘codes’ are also crucial for quickly assimilating data for storage back into the estimating repositories, which leads to better future estimates. These codes typically relate to the Work Breakdown Structure (WBS), the Organisation Breakdown Structure (OBS) and the Cost Breakdown Structure (CBS). Another important code is what is referred to as the project Code Of Accounts (COA), which classifies equipment and other related costs at ever-increasing levels of detail.
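To illustrate how such a coding structure lets project data be dissected quickly, here is a minimal sketch. The hyphen-separated "WBS-discipline-COA" code layout and all figures are hypothetical, for illustration only; real coding structures are defined per organisation:

```python
# Hedged sketch: rolling up costs by segments of a structured cost code.
# The "WBS-discipline-COA" format below is hypothetical.
from collections import defaultdict

# Each record: (cost code, amount)
records = [
    ("1.2-CIV-2100", 50_000),   # area 1.2, civil discipline, COA 2100
    ("1.2-MEC-3300", 120_000),  # area 1.2, mechanical, COA 3300
    ("1.3-CIV-2100", 75_000),   # area 1.3, civil, COA 2100
]

def rollup(records, segment_index):
    """Sum amounts by one code segment (0 = WBS area, 1 = discipline, 2 = COA)."""
    totals = defaultdict(int)
    for code, amount in records:
        key = code.split("-")[segment_index]
        totals[key] += amount
    return dict(totals)

print(rollup(records, 1))  # {'CIV': 125000, 'MEC': 120000}
```

The same records can be sliced by WBS area, discipline or account without re-keying anything, which is precisely the benefit of a disciplined coding structure.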

Unfortunately, too often an organisation is not willing to invest in the database as it is deemed to be ‘too expensive’ leading to a gradual degradation and decline of both the system and content over time. This is not a ‘nice to have’ anymore, it is an absolute essential and for a number of companies this is their competitive advantage.

Suitable systems do exist, but what happens far too frequently is that companies try to make financial systems do project control work, more often than not with disastrous results.

From integrated database to improved decision-making

Figure 1:  From integrated database to improved decision-making


The Team

A Cost Engineering team will usually have a mix of leadership/managerial, technical, analytical and administrative competencies on large projects or programmes. The team must work together and should possess at least two traits which are somewhat unique. These are the ability to be methodical and disciplined. These people are subjected to mountains of data and they need to be methodical in analysing and reporting this information. They also need to be disciplined as reports are required on time, every time. Reports need to be accurate and meaningful, otherwise credibility is compromised and that is tough to restore.

One critical success factor is the application of a disciplined process of Plan, Do, Check and Assess (see Figure 2 below) for all project phases and monthly reporting cycles. Once you have delivered a report late, or the integrity of the information is suspect, you will be continuously bombarded with queries of all sorts. You should also ensure that you have an in-depth understanding of the details behind the reports, so that if a question is asked you have the answer.

The Total Cost Management Framework process (there are others as well)

Figure 2:  The Total Cost Management Framework process (there are others as well)


An area within the team that frequently gets misunderstood by senior management, however, is the distinction between the Project Control/Cost Control and Financial Control functions. This is because the word ‘cost’ is deemed common to both, so they must therefore do the same work. Nothing could be further from the truth. Project Control/Cost Control is the control of the project scope against a budget and time frame, and the forecasting of where the costs and schedule will end. It is forward looking and attempts to anticipate where the project is heading, whilst proactively identifying possible future pitfalls and preparing risk mitigation plans.

Financial control (backward looking) is about making the requisite payments to contractors and suppliers timeously against signed off and approved invoices and then reflecting these costs correctly in company balance sheets, departmental cost centres and/or company asset registers. These two functions should never be confused with each other, yet, in practice this happens all too frequently.

As John Maddalena so eloquently put it: “Plan by commitment, not expenditure. In that one statement is captured the entire complexity of the project controls framework and the key difference between projects and business”.
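The distinction behind that quote can be shown with a small, hypothetical example: a purchase order obligates its full value the moment it is committed, while expenditure lags behind as invoices are approved and paid. All figures below are illustrative only:

```python
# Hedged sketch: commitment-based vs expenditure-based views of the same
# hypothetical purchase order.

po_value = 1_000_000                 # committed the day the PO is signed
invoices_paid = [200_000, 300_000]   # expenditure to date

committed = po_value
spent = sum(invoices_paid)

print(f"Committed:   {committed:,}")   # project control forecasts against this
print(f"Expenditure: {spent:,}")       # financial control reconciles this
```

A project that plans against expenditure would see only 500,000 'spent' here and be lulled into a false sense of headroom, even though the full 1,000,000 is already obligated: exactly the trap the quote warns against.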

The WBS and Change Management

I have found that if ownership of the Work Breakdown Structure (WBS) as shown in Figure 3 resides within the Cost Engineering or Project Controls team, the propensity to change the WBS is reduced significantly. This is because it can only be changed via a formal signed change request which is then clearly and regularly communicated to all stakeholders via the formal change management process.

It is also strongly recommended that regular formal meetings are held to discuss and sign off on proposed changes, and that the business and project decision-makers attend. This stimulates constructive debate and forces a decision there and then, with the obvious benefit of a consensus decision and speedy implementation. It must also be noted that each and every change introduced into the change management system has to be prepared, costed and possibly executed by individuals in the team. Studies have shown that the more such changes are introduced, the greater the loss of productivity, and hence schedule delays occur. Most people will also be reluctant to request changes for change’s sake, as there will be numerous questions asked.

Unplanned or unnecessary scope changes have huge impacts on project cost, time and productivity. A strong change register and change management process is an absolute prerequisite. Studies conducted by IPA indicate that engineering changes, construction changes and schedule delays are primary drivers of increased costs on projects. Don’t underestimate the effect that good change management can have on a project.

A typical Work Breakdown Structure

Figure 3:  A typical Work Breakdown Structure



The Contracts

We are all aware that there are many different forms of contract in the project world, ranging from fully reimbursable to Lump Sum Turnkey; see Figure 4. Does the Cost Engineering team understand the pros and cons of each type? Does the team know why a particular type was selected for a particular scope of work on the project? Does the team have access to these contracts at short notice? Better still, do they have them next to them in case they need to check some contract detail? Does the team understand the implications of each contract type on the level of detail required in the corporate progress tracking and reporting systems?

The Cost Engineering team must have access to these documents and must also know and understand them intimately. Have them close by at all times. The Cost Engineer/s and the procurement team need to work hand in hand and must assist each other in their daily work. If this relationship breaks down, the project will inevitably be compromised as reporting becomes dysfunctional and difficult.

Remember though that the responsibility for using the contract information for analysis and reporting lies solely with the Cost Engineer and your integrity will be severely compromised if you share sensitive or confidential information with others who have no right to this information, especially labour rates. Never, ever, compromise yourself in this way.

Different contracting options


Figure 4:  Different contracting options


Processes and Procedures

Many hours are spent designing and implementing these standards. They are there to ensure predictability and repeatability which is so necessary in Cost Engineering / Project Controls. The last thing you want is the same set of data giving you five different answers.

Often during the application of these standards on projects, the team may add to or improve on them as new or innovative ways of executing the work are discovered, or as flaws are unearthed in current processes. It is crucial to these companies that they are able to quickly gather these innovations or flaws and build them back into existing standards, so that the portfolio of future projects benefits from them. Alas, all too often this is not done. Staff members write up the close-out manuals highlighting best practices developed on their projects, but the manuals are then left on the shelf gathering dust, and the staff move on to other projects before proper debriefing has taken place.

Lessons learned and process improvements must find their way back into the respective systems and procedures effortlessly and quickly.

Closing remarks

The five best practices mentioned above have, in my experience, tended to become difficult issues to manage, especially on larger projects or programmes. I have, however, found that these practices are crucial for improved project outcomes.

In closing, I would also like to suggest that, with the ever-increasing availability and functionality of technology, I am convinced now, more than ever before, that the Cost Engineering or Project Controls team needs a dedicated database administrator/technical Information Technology (IT) person available to them. The complexities of database set-up, reporting and maintenance are taking more and more time from the Cost Engineering/Project Control team, who should be doing analytical work and walking the site instead of fine-tuning databases and spreadsheets in their offices. Being able to provide specialised, ad hoc reports in an agile fashion, tailored for the project, is becoming ever more crucial as project complexities increase.

I hope you have found this article useful and I look forward to publishing further articles on Cost Engineering in the near future.


Maddalena, John – Director, Barsure

If you have any comments or suggestions on how to improve the article, please feel free to contact the author or leave your comment below.