Despite stultifying sandstorms, serious terrorist threats and high political drama, water and wastewater utilities in Egypt continue to do their best to provide services to their communities. Artesia's Simon Gordon-Walker has been working with VNG, a Dutch municipal consulting organisation, on an EU-funded programme to help improve the capacity of the national Holding Company for Water and Wastewater (HCWW). The HCWW is responsible for managing the 25 individual utilities, known as Affiliated Companies (ACs), that operate under the local Governorates.

A decade ago the US Agency for International Development (USAID) provided the funding and experts to develop a large data management system called MARS for the HCWW and the ACs, from which performance indicators could be derived. Every quarter the HCWW's Economic Analysis Department (EAD) produces reports covering 65 performance indicators on water supply and wastewater services. This information is presented to the HCWW Chairman for discussion with the Chairmen of the individual ACs. The HCWW provides feedback through a Chairman's letter, which often points out that current performance is declining compared with past performance and that something must be done to rectify the trend. The feedback is not provided on a comparative basis with other ACs; any such comparison is kept "confidential" within the HCWW, so a significant barrier to effective benchmarking exists at present.

Currently benchmarking takes place on an ad hoc basis and is not mainstream within the work of the HCWW or the ACs, despite their expressed interest. Although the HCWW's feedback identifies areas for performance improvement and occasionally suggests undertaking a benchmarking project, it is a one-way system in which the HCWW appoints the benchmarking partners. This is not always helpful, because an important feature of benchmarking is that participants take part voluntarily, convinced that they will gain by learning about best performance from others. This "top-down" approach does not guarantee that the ACs will regard the benchmarking as worthwhile.

Discussions at the workshop held during Simon's assignment led to a strong recommendation that the HCWW assume the role of an active benchmarking "facilitator". The facilitation role would involve:

  1. The HCWW convening regular benchmarking workshops involving all the ACs.
  2. The HCWW being transparent at these workshops in the presentation and analysis of comparative performance information, ensuring that appropriate context and explanatory factors are provided.
  3. Performance indicators being discussed so that the ACs themselves identify the areas in which they are interested in, and committed to, participating in a benchmarking programme.
  4. The HCWW providing organisation and guidance during the benchmarking process, with the aim of supporting the implementation of five benchmarking projects each year.
  5. The benchmarking programmes following the process described in the Process Benchmarking Manual prepared by Simon.

Data reliability and validation were also identified as important issues. The data provided to the HCWW over the years has tended to lack robust accuracy checks, and some data is not provided unless the HCWW presses hard for it. Although EAD staff visit the ACs frequently and work hard to resolve data reliability issues, it is a relentless task, and problems with the reporting of accurate information remain.

Simon prepared a data validation protocol to help the ACs submit data efficiently to the MARS database. The protocol records the source of each data item and the method of data collection, and gives guidance on the accuracy categories to be used, such as confidence bands for each type of data source. The HCWW will pilot the protocol with selected ACs to test whether a formal system for data accuracy is appropriate. The trial would be followed by a review and then a roll-out across all ACs, starting with a small selection of data items and gradually expanding the protocol to cover the full data input requirements.
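To make the idea concrete, the sketch below shows one way a submission record could carry its source, collection method and confidence band. It is a minimal illustration only: the source categories, band percentages, field names and the example indicator are assumptions for this sketch, not the actual definitions in the protocol prepared for the HCWW.

```python
from dataclasses import dataclass
from enum import Enum

class DataSource(Enum):
    """How a data item was obtained (hypothetical categories)."""
    METERED = "metered"            # direct measurement, e.g. a bulk flow meter
    OPERATIONAL_RECORD = "record"  # taken from routine operational logs
    ESTIMATE = "estimate"          # calculated or expert estimate

# Illustrative confidence bands (+/- %) per source type; a real protocol
# would define its own bands for each type of data source.
CONFIDENCE_BANDS = {
    DataSource.METERED: 5.0,
    DataSource.OPERATIONAL_RECORD: 15.0,
    DataSource.ESTIMATE: 30.0,
}

@dataclass
class DataSubmission:
    """One data item submitted by an AC to the MARS database."""
    indicator: str          # e.g. "non-revenue water (%)"
    value: float
    source: DataSource
    collection_method: str  # free-text description of how it was collected

    def confidence_band(self) -> tuple[float, float]:
        """Return the (low, high) range implied by the source's band."""
        band = CONFIDENCE_BANDS[self.source]
        return (self.value * (1 - band / 100), self.value * (1 + band / 100))

# Example: an estimated figure carries a wide band, flagging it for review.
item = DataSubmission("non-revenue water (%)", 32.0, DataSource.ESTIMATE,
                      "water balance calculation, unmetered district")
low, high = item.confidence_band()
print(f"{item.indicator}: {item.value} (plausible range {low:.1f}-{high:.1f})")
```

The point of attaching a band at submission time is that reviewers at the HCWW can see at a glance which figures rest on measurement and which on estimation, and target their validation effort accordingly.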