Tuesday, 5 November 2013

Applying Quality Methods to Water Treatment: Environment, Equipment, Materials, Measurement, Method, Manpower

Typical utility water systems are subject to considerable variation. Makeup water characteristics can change over time. The abruptness and degree of change depend on the source of the water. Water losses from a recirculating system, changes in production rates, and chemical feed rates all introduce variation into the system and thereby influence the ability to maintain proper control of the system. Other variables inherent in utility water systems include:
water flow/velocity
water temperature
process/skin temperature
process demands
evaporation rates
operator skill/training
water characteristics (suspended solids, hardness, pH swings)
treatment product quality

These variables are considered and introduced during the applications and pilot plant testing of new products for the treatment of various water systems. Pilot plant simulation of actual operational variation is a challenging task. Every industrial water system is unique, not only in the production operations it supports and the sources of water it receives, but also in the degree of inherent variation encountered due to the factors listed above. While a very sensitive treatment program that must operate within a narrow control range may be suitable for one system, another system requiring the same degree of protection may be incapable of maintaining the required control. Consequently, inferior results must be accepted unless the system is improved to support the sensitive program.
In operating systems, proper treatment of influent, boiler, cooling, and effluent waters often requires constant adjustment of the chemistry to meet the requirements of rapidly changing system conditions. A well-designed program is essential to maintaining proper control. The program should include proper control limits and the ability to troubleshoot problems that interfere with control of water chemistry. Success in troubleshooting depends on the knowledge, logic, and skills of the troubleshooter. In order to improve operations, it is necessary to recognize the importance of continuous improvement and to be familiar with some tools and procedures necessary to support this effort.
Adequate and reliable data are essential if variation in a system is to be measured and reduced. Specialized computer software can assist efforts to manage, summarize, and use data effectively. Process data can be stored in a database and retrieved and analyzed as needed in a variety of formats. Computers provide nearly instantaneous access to many months or years of process data that would require several filing cabinets if stored on paper log sheets. The computer can be used to graph and analyze the data in a variety of formats, such as statistical process control (SPC), trend analysis, and histograms. The operator is able to troubleshoot the system based on these analyses without spending large amounts of time manually researching and analyzing the data. In his classic book Managerial Breakthrough (McGraw-Hill: New York, 1964, pp 1-14), Dr. J. M. Juran develops the important distinction between quality control and quality improvement, and describes the elements of effective problem solving in each case. These distinctions and relationships are summarized in Figure 3-2.
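Before turning to Juran's framework, here is a minimal sketch of the computerized data handling described above: loading an archived process log, summarizing it, and computing a simple rolling trend. The file name and column names (feedwater_log.csv, date, hardness_ppm) are hypothetical.

# Sketch: summarizing archived process data, assuming a CSV log
# with columns "date" and "hardness_ppm" (both names hypothetical).
import pandas as pd

log = pd.read_csv("feedwater_log.csv", parse_dates=["date"])

# Long-term summary statistics that would be tedious to extract
# from paper log sheets.
print(log["hardness_ppm"].describe())

# 30-sample rolling mean as a simple trend-analysis view.
log["trend"] = log["hardness_ppm"].rolling(window=30).mean()
print(log.tail())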

Figure 3-2. Quality improvement program sets new standard for feedwater hardness.
QUALITY CONTROL ZONE
Although the performance of a process varies from day to day, the average performance and the range of variation are fairly constant over time. This level of performance is inherent in the process and is provided for in the system design. The Quality Control Zone in Figure 3-2 depicts the accepted average and accepted range of variation in feedwater hardness. This zone is often adopted as the standard of performance. Sometimes, performance falls outside the accepted, or standard, range of variation in the Quality Control Zone. This is depicted in Figure 3-2 by the sporadic spike. The goal of problem solving in the Quality Control Zone is to reestablish performance within the standard. This involves the following steps (a minimal detection sketch follows the list):
detecting the change (sporadic spike)
identifying the cause of the change
taking corrective action to restore the status quo
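The detection step lends itself to automation. The fragment below flags readings that leave the accepted range; the 0-5 ppm limits and the hourly values are illustrative numbers only, not taken from any real system.

# Sketch: flagging a sporadic spike in feedwater hardness, assuming
# an accepted control range of 0-5 ppm (illustrative numbers only).
ACCEPTED_LOW, ACCEPTED_HIGH = 0.0, 5.0  # standard range of variation

readings = [2.1, 2.4, 1.9, 2.2, 7.8, 2.3]  # hypothetical hourly data

for hour, value in enumerate(readings):
    if not ACCEPTED_LOW <= value <= ACCEPTED_HIGH:
        # Step 1: detect the change (sporadic spike).
        print(f"hour {hour}: {value} ppm outside accepted range")
        # Steps 2-3 (diagnosis and corrective action) are manual:
        # identify the cause, then restore the status quo.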

QUALITY IMPROVEMENT ZONE
Problem solving in the Quality Improvement Zone (also depicted in Figure 3-2) can have an even greater impact. The goal of quality improvement is to reject the status quo as the standard and reach a level of performance never before achieved. This level, the "New Zone of Quality Control," represents the achievement of lower costs and/or better performance. In this case, significantly lower feedwater hardness decreases scaling potential and improves boiler reliability.

This step extends the scope of problem solving beyond the correction of obvious problems. While it is important to "make the system work," it is often more important to view the entire system to identify areas of potential improvement. Some systems are poorly planned; others have not been updated to keep pace with changing requirements and progressing technology. In either case, it is often the system that causes control and operational problems, not the people working within the system.
Quality Improvement Tools
While a proper mindset must exist for continuous improvement, certain problem solving procedures and tools can add structure and consistency to the effort. The following quality improvement tools provide the means to summarize and present meaningful data in a way that adds significance to the successful resolution of chronic problems.
Flow Diagrams. A flow diagram provides a graphic presentation of the steps required to produce a desired result. For example, this tool may be used to clarify the procedures used to regenerate a softener or the steps to be taken in the event of an upset in a cooling tower. Flow diagrams are used in problem solving to give all parties a common understanding of the overall process.
Brainstorming. In diagnosing a problem, new and useful ideas can result when all of the people familiar with the process meet to share their experiences and ideas. Possible causes are discussed and possible solutions are presented and evaluated.
Cause-Effect Diagrams. An important first step in quality improvement is the identification of the root causes of a problem. A cause-effect diagram provides an effective way to organize and display the various ideas of what those root causes might be. Figure 3-3 graphically presents possible causes for reduced demineralizer throughput.
Figure 3-3. Cause-effect diagram of possible causes for reduced demineralizer throughput.
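A cause-effect (fishbone) diagram is usually drawn, but its skeleton can be held as simple data. The sketch below organizes causes under the six categories named in the title of this post; every listed cause is a hypothetical example, not taken from the original figure.

# Sketch: organizing brainstormed root causes for reduced demineralizer
# throughput under the classic 6M categories named in the post title.
# All listed causes are illustrative, not from the original figure.
cause_effect = {
    "Manpower":    ["regeneration steps skipped"],
    "Method":      ["regenerant flow rate set too high"],
    "Materials":   ["degraded resin bed"],
    "Measurement": ["flowmeter out of calibration"],
    "Equipment":   ["fouled distributor"],
    "Environment": ["seasonal swing in raw water solids"],
}

effect = "reduced demineralizer throughput"
print(f"Effect: {effect}")
for category, causes in cause_effect.items():
    for cause in causes:
        print(f"  {category:12s} -> {cause}")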

Scatter Diagrams. A scatter diagram is useful in providing a clear, graphic representation of the relationship between two variables. For example, boiler feedwater iron levels might be plotted as a function of feedwater pH to confirm or rule out a cause-effect relationship.
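A scatter diagram of that example takes only a few lines. The readings below are made up; the correlation coefficient gives a quick numerical check alongside the plot.

# Sketch: scatter diagram of feedwater iron versus feedwater pH,
# using invented readings to confirm or rule out a relationship.
import matplotlib.pyplot as plt
import numpy as np

ph   = np.array([8.6, 8.8, 8.9, 9.0, 9.1, 9.2, 9.3, 9.4])
iron = np.array([45., 38., 35., 30., 26., 22., 20., 18.])  # ppb, hypothetical

r = np.corrcoef(ph, iron)[0, 1]  # linear correlation coefficient

plt.scatter(ph, iron)
plt.xlabel("feedwater pH")
plt.ylabel("feedwater iron, ppb")
plt.title(f"r = {r:.2f}")
plt.show()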
Pareto Analysis. Pareto analysis is a ranked comparison of factors related to a quality problem, or a ranking of the cost of various problems. It is an excellent graphic means of identifying and focusing on the vital few factors or problems. Figure 3-4 represents an analysis of the calculated cost of various problems interfering with the successful management of a utility water system.
Figure 3-4. Pareto analysis helps identify the vital few factors that have the greatest impact on water system performance.
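The ranking itself is simple arithmetic: sort the problems by cost and accumulate the percentages. The categories and dollar figures below are invented stand-ins for the values behind Figure 3-4.

# Sketch: Pareto ranking of problem costs, with hypothetical
# categories and dollar figures.
problem_costs = {
    "boiler scaling":        120_000,
    "condenser fouling":      80_000,
    "demineralizer upsets":   30_000,
    "chemical overfeed":      15_000,
    "instrument drift":        5_000,
}

total = sum(problem_costs.values())
running = 0.0
for problem, cost in sorted(problem_costs.items(), key=lambda kv: -kv[1]):
    running += cost
    print(f"{problem:22s} ${cost:>8,}  cumulative {100 * running / total:5.1f}%")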
Meaningful Data Collection. Meaningful collection of data and facts is fundamental to every quality improvement effort. Quality improvement is an information-intensive activity. In many cases, problems remain unsolved for long periods of time due to a lack of relevant information. A good data collection system must be carefully planned in order to provide the right information with a minimum of effort and with minimal chance of error.
In order to plan for data collection, it is necessary to identify potential sources of bias and develop procedures to address them (a sketch of one such procedural guard follows the list):
Exclusion bias. If a part of the process being investigated has been left out, the result will be biased if the data is intended to represent the entire process. For example, if data on attemperating water purity is not included in an evaluation of a steam turbine fouling problem, the cause could be missed.
Interaction bias. The process of collecting the data itself can affect the process being studied. For example, if an operator knows that cooling tower treatment levels are being monitored by the central laboratory, he may be more careful conducting his own tests.
Perception bias. The attitudes and beliefs of the data collectors can influence what they perceive and how they record it. If an operator believes that swings in steam header pressure are his responsibility, he may record that operation was normal at the time of boiler water carryover.
Operational bias. Failure to follow the established procedures is a common operational bias. For example, failure to cool a boiler water sample to 25 °C (77 °F) often leads to an erroneous pH measurement.
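Procedures can be enforced in software as well as on paper. The fragment below sketches a guard against the operational bias example above, rejecting a pH entry unless the sample has been cooled to 25 °C; the ±1 °C tolerance is an assumption, not a cited standard.

# Sketch: guarding against the operational bias described above by
# refusing a pH entry from an uncooled sample (tolerance is assumed).
def record_ph(sample_temp_c: float, ph: float) -> None:
    if abs(sample_temp_c - 25.0) > 1.0:
        raise ValueError(
            f"sample at {sample_temp_c} °C; cool to 25 °C before measuring pH"
        )
    print(f"recorded pH {ph} at {sample_temp_c} °C")

record_ph(25.3, 9.8)          # accepted: procedure followed
try:
    record_ph(48.0, 9.2)      # rejected: sample not cooled first
except ValueError as err:
    print(err)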

Graphs and Charts. Pictorial representations of quantitative data, such as line charts, pie charts, and bar graphs, can summarize large amounts of data in a small area and communicate complex situations concisely and clearly.
Histograms. The pictorial nature of a histogram (a graphic summation of variation in a set of data) reveals patterns that are difficult to see in a simple table of numbers. Figure 3-5(a) is a histogram that shows the variation of inhibitor level in a cooling water system. Each bar along the horizontal axis represents a specific range of inhibitor concentration, in parts per million. The scale on the vertical axis represents the number of occurrences within each range of concentration. The shape of this particular histogram indicates a normal and predictable pattern of distribution. There are no incidents of nonconformance outside of the specified tolerance limits of 60-80 ppm, represented by the dotted lines.
In contrast, the patterns of variation depicted in Figures 3-5(b) and 3-5(c) represent problems that must be corrected.
Figure 3-5. Histograms of inhibitor level variation in a cooling water system.
The pattern of distribution in Figure 3-5(b) is relatively normal, but a few incidents of nonconformance occur outside of the engineering limits, departing significantly from the otherwise normal distribution. The cause of these occurrences must be investigated, and the process corrected to a more predictable pattern. Figure 3-5(c) represents a normal and predictable pattern, but reveals several occurrences that fall outside of the specified 60-80 ppm limits, indicating that there is too much natural variation in the process.
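A histogram like Figure 3-5(a) can be reproduced in a few lines. The sketch below uses synthetic, normally distributed inhibitor readings; only the 60-80 ppm tolerance limits come from the text.

# Sketch: histogram of inhibitor residuals in the style of Figure 3-5(a),
# using synthetic data; the 60-80 ppm limits come from the text.
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(1)
inhibitor_ppm = rng.normal(loc=70, scale=3, size=200)  # stable process

plt.hist(inhibitor_ppm, bins=15)
plt.axvline(60, linestyle="--")  # lower tolerance limit
plt.axvline(80, linestyle="--")  # upper tolerance limit
plt.xlabel("inhibitor concentration, ppm")
plt.ylabel("occurrences")
plt.show()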
Statistical Process Control. Statistical process control (SPC) is the use of statistical methods to study, analyze, and control the variation in any process. It is a vehicle through which one can extract meaningful information about a process so that corrective action, where necessary, can be implemented. While a histogram is a pictorial representation of patterns of variation, SPC is used to quantify this variation and determine mathematically whether the process is stable or unstable, predictable or erratic. Figure 3-6 shows three SPC charts of the individual values of measurement used to construct the histograms in Figure 3-5. In these cases, the data is plotted chronologically and used interactively to determine whether a value falls outside of the statistical (predictability) limits.
With statistical process control, the actual historical data is used to calculate the upper and lower statistical limits as a guideline for future operation. Anything falling outside of the statistical limits is considered to be a special cause of variation requiring immediate attention. Of course, if the common causes of variation are excessive for either engineering or economic reasons, as is the case in Figures 3-5(c) and 3-6(c), improvement to the process is necessary until the statistical limits are narrowed to the point of acceptability.
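As a sketch of how those statistical limits can be calculated, the fragment below applies the standard individuals (X-mR) control chart formula, center line ± 2.66 times the average moving range, to invented inhibitor readings.

# Sketch: statistical limits for an individuals (X-mR) control chart.
# The 2.66 factor is the standard constant for moving ranges of size 2;
# the data values are invented.
import numpy as np

x = np.array([71, 69, 72, 68, 70, 73, 69, 71, 70, 72], dtype=float)

mean = x.mean()
mr_bar = np.abs(np.diff(x)).mean()          # average moving range
ucl = mean + 2.66 * mr_bar                  # upper statistical limit
lcl = mean - 2.66 * mr_bar                  # lower statistical limit

print(f"center {mean:.1f}, limits {lcl:.1f} .. {ucl:.1f} ppm")
out = x[(x < lcl) | (x > ucl)]              # special causes of variation
print("special-cause points:", out)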
