Jake’s clients often asked, “Can I perform an online performance test?” Usually they were hoping for an easy, inexpensive approach to performance testing, two shortcuts that lend themselves to useless performance testing. Jake would reply with a question of his own: “Do you want to do performance testing or performance monitoring?” The usual response was testing.
Jake would then provide the questioner with his list of requirements for a valid online performance test: instrumentation installed in the proper orientation; orifice plates pulled and verified or replaced, with the required minimum straight runs confirmed; thermowells installed with proper runs so that incomplete mixing will not skew the results; pressure gauges or sensors calibrated; and any required power-measurement instrumentation calibrated or verified.
How long does the performance test need to last? In the old days, Jake would require a minimum of one hour of stable readings. To get that stable hour, Jake might have to run the test for days. Performance testing a process means the process must remain stable for that period so that average measurements can be used to determine whether the piece of equipment meets design requirements. The one-hour window ensures all individual data collectors are synced and stable. Stable values could warrant a shorter test. Stability was critical because a process might have some internal cyclical variation that would show up on a one-hour test but otherwise be masked by instrument inaccuracies.
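Jake's stability criterion can be sketched as a simple check: before trusting an hour-long average, the spread of the readings over the window should be small relative to the instrument accuracy band, otherwise a cyclical swing may be lurking in the data. A minimal illustration in Python (the function name, accuracy figure, and design values are hypothetical, not from the article):

```python
def is_stable(readings, instrument_accuracy_pct, design_value):
    """Check whether a window of readings is steady enough to average.

    The window counts as stable when its peak-to-peak variation fits
    inside the instrument-accuracy band around the design value;
    otherwise an internal cyclical variation may be present.
    """
    span = max(readings) - min(readings)
    band = 2 * design_value * instrument_accuracy_pct / 100.0
    return span <= band

# One hour of minute-by-minute temperatures around a 120 degF design point
steady = [119.8 + 0.1 * (i % 3) for i in range(60)]     # drifts 0.2 degF
cycling = [120.0 + 2.5 * ((-1) ** i) for i in range(60)]  # swings 5 degF

print(is_stable(steady, instrument_accuracy_pct=0.5, design_value=120.0))   # True
print(is_stable(cycling, instrument_accuracy_pct=0.5, design_value=120.0))  # False
```

The point of the check is the one Jake makes: a swing larger than the accuracy band is real process variation, not instrument noise, and averaging over it would mask it.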
Jake also asked, “What are you going to do with the performance test data? Will this be the first test of a new or reconditioned piece of equipment? Are you trying to confirm that you got what you paid for? Has the supplier agreed to the methodology for the test?”
Based on the responses, Jake often would circle back and ask again, “Do you want to do testing or monitoring?”
Real-World Process Example
Mort listened to all of Jake’s questions and returned to his process area to prepare for the test. He confirmed the location and calibration of all of the instrumentation for the required data points. Mort had recently upgraded an overheads condenser and wanted to verify that the equipment was performing as designed. He met with Jake and the supplier, and they all agreed to the methodology detailed in Mort’s test plan. The test intrigued the supplier, who decided to send one of its engineers to monitor the test.
Jake designed a spreadsheet that queried the process every minute, performed a heat and mass balance after each query, and output the balances in percent of design. It also compared each data point with the design data and reported the deviation in percent. These results were then checked against limits; if any fell outside, an alarm was sent.
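The per-minute logic of Jake's spreadsheet can be sketched in a few lines: take a sample of readings, express each as a percent deviation from design, and flag any point outside a limit. The tag names, design values, and the 5% alarm limit below are illustrative assumptions, not details from the actual test plan:

```python
# Hypothetical design basis for an overheads condenser test.
DESIGN = {"duty_mmbtu_hr": 12.0, "flow_klb_hr": 85.0, "cond_temp_degF": 120.0}
ALARM_LIMIT_PCT = 5.0  # assumed alarm limit, percent of design

def deviations_pct(sample):
    """Return each data point's deviation from design, in percent."""
    return {tag: 100.0 * (sample[tag] - design) / design
            for tag, design in DESIGN.items()}

def check_sample(sample):
    """Flag any data point whose deviation from design exceeds the limit."""
    devs = deviations_pct(sample)
    alarms = [tag for tag, d in devs.items() if abs(d) > ALARM_LIMIT_PCT]
    return devs, alarms

# One minute's readings: duty and flow are near design, temperature is not.
sample = {"duty_mmbtu_hr": 11.8, "flow_klb_hr": 84.0, "cond_temp_degF": 128.0}
devs, alarms = check_sample(sample)
print(alarms)  # ['cond_temp_degF'] -- about 6.7% above design
```

In the real spreadsheet this check ran once per minute alongside the heat and mass balance closure; the balance closure itself would be checked against its own limit in the same way.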
On the day of the test, Mort met with the area operations personnel and explained the goal of the online performance test. He stressed that the process needed to be as stable as possible to validate the test. Operators asked Mort and Jake questions, and the more experienced operators provided input and suggestions during the discussion.
Then they began the test. The process cooperated for the most part, and as the first sets of data began rolling in, there were no alarms on either the data points or the balances. They were able to validate the test, and the test process impressed the equipment supplier. Most importantly, the heat exchanger met the design specifications as promised.
One of the engineers asked Jake if the performance spreadsheet could be modified for an online monitoring task. Jake indicated it could, but he would need to modify it so that it trended the data instead of performing the minute-by-minute analyses.
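The change Jake describes, from per-sample alarming to trending, amounts to smoothing the stream of readings, for example with a rolling mean, so that the monitor shows direction rather than minute-to-minute noise. A minimal sketch, with a hypothetical window size:

```python
from collections import deque

def rolling_mean(values, window=15):
    """Trend a stream of readings as a rolling mean for monitoring.

    Each output point is the mean of the last `window` readings seen
    so far, smoothing out minute-to-minute scatter.
    """
    buf, out = deque(maxlen=window), []
    for v in values:
        buf.append(v)
        out.append(sum(buf) / len(buf))
    return out

print(rolling_mean([10, 10, 13, 13], window=2))  # [10.0, 10.0, 11.5, 13.0]
```

For monitoring, the trend would then be plotted or compared against design over hours or days, rather than alarmed on every minute's sample as in the test configuration.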
So, online performance testing is possible and can be set up to confirm or validate the design or equipment after installation. As with all performance testing, the test is only as accurate as its inputs, so care must be taken to evaluate the installation. Calibration and inspection remain critical elements of the test. Finally, it still takes a team to set up and perform the test; assistance from operations is critical to its success, since operations controls the process.
Good luck and happy energy hunting.
Earl M. Clark, PE, Engineering Manager, Global Energy Services. Clark retired from DuPont after a career of 39 years and 11 months and joined Hudson’s Global Energy Systems Group as Engineering Manager. During his more than 43 years in the industry, he has worked in nearly all aspects of the energy field: building, operating and troubleshooting energy facilities for DuPont. He began his energy career with Duke Power and Clemson University during the energy crisis of the 1970s.
Active in both the American Society of Mechanical Engineers (ASME) and the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE), Clark was chairman of ASHRAE's task group on Halocarbon Emissions and served on the committee that created ASHRAE SPG3 - Guideline for Reducing Halocarbon Emissions. He has written numerous papers on CFC alternatives and retrofitting CFC chillers. He was awarded a U.S. patent on a method for reducing emissions from refrigeration equipment and has served as a technical resource on several others.