Interoperability of Process Simulation Software

OVERALL CONTEXT OF SIMULATION IN PROCESS ENGINEERING
It has been said that the ability to learn faster than your competitors is an organisation's only sustainable competitive advantage. Quality, service, technology, price, marketing, and patents are certainly required, but the ability to adapt more quickly than the competition is what sets an organisation apart. The rate at which an organisation can implement its next step forward will determine how long it can sustain any given advantage. Computer Aided Process Engineering (CAPE) has a significant role in allowing companies to rapidly develop and implement improvements in the design and operation of manufacturing plants.
The key business drivers in today's competitive climate and, more specifically, for the use of CAPE are:
- Mergers, acquisitions, and divestitures that have resulted in a rapidly changing mix of competitors intensely competing for market share.
- The combined effect of rapid change and competitive pressures, which means an increased emphasis on capital efficiency and minimum total cost of ownership (minimised working capital, asset base, and maintenance budgets).
- Business changes that have the effect of reducing continuity in the technical community, with people moving between business units/technology areas or between companies. CAPE practitioners become less frequent users of CAPE tools, resulting in a reduction of their specific CAPE expertise.
- The use of engineering contractors, which increases the risk that project objectives might not be translated into desired individual behaviours.
- Increasingly strict environmental and health and safety performance legislation.

Before going any further, we need to define some terms. We are using the term CAPE to cover all of the activities in which computers are used to assist in the design and operation of processes. Process simulation is often taken to mean that aspect of CAPE in which process models are used to make predictions about the performance of process plants or collections of plants. This implies that process modelling is the building of the process models used for simulation. However, simulation and modelling are often used interchangeably. In the interests of brevity, we will therefore use the terms simulation or process simulation to encompass the activities of creating process modelling components, combining them to represent an industrial process, and using the resulting overall model to study some aspect of the performance of the process.

BUSINESS CASE FOR INTEROPERABILITY
In this environment, it is clear that process engineers will place heavy demands on CAPE software capabilities as they strive to deliver the necessary quality and speed of response. At the heart of this endeavour is usually the need for a high-fidelity process model to enable meaningful process simulations to be carried out. The range of technical capability needed for a given process simulation is often so great that no single provider is likely to deliver best-in-class in every area. This requires the use of Process Modelling Components (PMCs) from multiple sources in the chosen Process Modelling Environment (PME). This situation makes complete interoperability a requirement for taking full advantage of the possibilities of CAPE. Here, complete interoperability is the ability of a PMC to run as if it were an integral component within any PME. It provides full "plug and play" operation, so that there is no longer any need for a different version of a PMC for each PME, or for the interface to be revised each time a new version of the PME is issued. From this point on, when we use the word interoperability, we mean complete interoperability as described above.
To achieve interoperability, PMC and PME providers must adhere to a single, widely agreed interface standard. Use of such a standard can:
- Reduce the total effort required through the whole software cycle: creation, maintenance, and updates.
- Enhance the relationship between universities and industry by facilitating the transfer of simulation technology between them.
- Stimulate universities, software developers, equipment vendors, etc., to provide an increasingly diverse mix of simulation components.
- Improve ease of use.
- Give users access to specific equipment vendor process models, perhaps for detailed design (if the vendor is preselected) and certainly for operations, training, control, and optimisation.
- Give users access to best-in-class simulation components for all CAPE activities.
- Allow each component provider (academic, software vendor, equipment supplier, etc.) to focus on their own expertise, which promotes the production of best-in-class software components.
There are many examples of environments and components where interoperability is applicable. The traditional process simulator is the obvious PME, but CAPE covers the entire process engineering arena, not just process simulation. Furthermore, the end users of CAPE tools extend beyond the process engineers themselves into other technical disciplines (e.g. chemists).
In this broader context, possible PMEs include:
- Custom applications built using, for example, a spreadsheet. Possible PMCs that would be required in this environment would be:
  • physical property systems;
  • single detailed unit operations. An example of this could be a reactor model for use by chemists in design of experiments and data analysis.
- Any process design application that requires access to physical properties. Currently such properties are either provided by the user in the form of simple tabular data or through the application's own, and perhaps limited, proprietary physical property system. However, it would be much better if all such applications could use the standard physical property system in use within the company. A specific, sophisticated example of this would be the modelling of electrolyte systems.
Note that a single application may be either a PME or a PMC depending on the usage and context. For example, a stand-alone unit operation application may require the use of plug-in physical properties, in which case it acts as a PME. If the same application is used as a unit operation in a process simulator, then it acts as a PMC.
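This dual role can be sketched in code. The sketch below is purely illustrative: the interface names (`IPropertyPackage`, `IUnitOperation`) and the boiling-point figures are simplified stand-ins, not the actual CAPE-OPEN COM/CORBA definitions.

```python
from abc import ABC, abstractmethod

class IPropertyPackage(ABC):
    """Hypothetical stand-in for a plug-in physical property interface."""
    @abstractmethod
    def bubble_point_temperature(self, composition, pressure_bar):
        ...

class IUnitOperation(ABC):
    """Hypothetical stand-in for a unit operation interface."""
    @abstractmethod
    def calculate(self, feed):
        ...

class DummyProps(IPropertyPackage):
    """Toy property package: mole-fraction-weighted normal boiling points."""
    NBP = {"benzene": 353.2, "toluene": 383.8}  # K at 1 bar, illustrative

    def bubble_point_temperature(self, composition, pressure_bar):
        # crude estimate, for illustration only
        return sum(x * self.NBP[c] for c, x in composition.items())

class FlashDrum(IUnitOperation):
    """Stand-alone flash application: it HOSTS a plug-in property package
    (PME role) while also being usable AS a unit operation inside a
    flowsheet simulator (PMC role)."""
    def __init__(self, properties: IPropertyPackage):
        self.properties = properties          # PME role: accepts a plug-in PMC

    def calculate(self, feed):                # PMC role: callable by a host PME
        t_bub = self.properties.bubble_point_temperature(feed["z"], feed["P"])
        phase = "vapour+liquid" if feed["T"] > t_bub else "liquid"
        return {"t_bubble": t_bub, "phase": phase}

drum = FlashDrum(DummyProps())
result = drum.calculate({"z": {"benzene": 0.5, "toluene": 0.5},
                         "P": 1.0, "T": 370.0})
```

The same `FlashDrum` object plays both roles at once: it consumes one standard interface and exposes another, which is exactly the symmetry the CAPE-OPEN standard exploits.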
However, there are many benefits that can be obtained through the application of CAPE that currently are not routinely captured. Looking at the process plant life cycle, from process concept through to operations and beyond, shows where such benefits can be obtained and how interoperability can contribute.

Design and Revamp
It is possible to deliver process designs that are more capital efficient, i.e. that require less capital, or use the same capital more effectively. The value of process simulation tools here is in:
- enabling leaner and more agile project teams to use concurrent engineering, in place of the traditional sequential design process;
- rapid and efficient flowsheet optimisation.
This can be done through formal or informal process synthesis, optimisation, and dynamic simulation. These techniques are not all routinely used, but offer significant potential to deliver process designs which are safe to operate, cost effective, and easy to control. The use of interoperable software components means that the models needed for these activities could be assembled quickly from components in the marketplace, without the need for custom software.
Interoperability at the design stage can also mean that process synthesis, optimisation, and steady-state and dynamic simulation are less likely to be stand-alone exercises, but rather integrated with each other and the design process. This approach offers significant benefits:
- An integrated approach to assessing the operational impact of process constraints against the cost of removing them by equipment substitution.
- A framework within which to assess the robustness of the design in the face of uncertainty in the basic design parameters and assumptions.
- Integration of the process and control system design, enabling the design to be produced faster and with greater assurance that the process and control schemes are optimised.
- The ability to prospectively examine the dynamic behaviour of the plant and proposed control scheme, thus ensuring the process will operate as intended and is controllable.
- Process designs that minimise impact on the environment.
This can be achieved through the reduction in energy usage, the minimisation of emissions, and the production of designs that can cope with upset conditions in the most environmentally friendly way. This approach is already being trialled in BP, with synthesis, simulation, and optimisation systems sharing models in the exploration and production area.

Training
Plant start-up can be facilitated through the provision of a training simulator based on the process simulations developed in the design phase. The training simulator is also valuable for troubleshooting and for improving the understanding of the dynamics of the process. It subsequently becomes a tool for training replacement operators and for improving the capability of the existing operators in situations of abnormal operation.
Although training simulators are becoming more common, they are still not developed for every new plant. The most cost-effective training simulators are based on the models used during the design process, but this is definitely not current standard practice. The result is that a great deal of rework is needed to generate an entirely new model for the training simulator. Interoperability and the use of interoperable dynamic simulation components can greatly simplify the rapid development of a high-fidelity training simulator.

Operations
Process troubleshooting and performance monitoring are perhaps the most common applications of process simulation in operating plants. Process simulation is commonly used to investigate causes and possible solutions when the plant is unable to deliver the performance required. Being able to add or delete interoperable high-fidelity components in existing simulations helps maintain an understanding of the process as it evolves.
Most industrial companies routinely use process simulation in this way and undoubtedly generate benefits from the process. Significantly, some companies are now using process models in a more proactive way to routinely monitor equipment performance. The existence of interoperable simulation components allows the simulations to be continuously updated with the latest best-in-class elements. In particular, interoperability allows:
- Vendor models of the exact equipment installed in the plant to be easily included in the overall plant simulation.
- Easy update of the process simulation with the latest software vendor components.
- Transport of modelling components from one PME to another.
- The latest academic advances to be rapidly incorporated in process simulations.
Going beyond performance monitoring and operational troubleshooting, there is an opportunity to use process models to optimise plant performance and so generate increased profit for the business. The benefits of on-line, model-based optimisation have been robustly demonstrated in a number of areas, for example ethylene plant optimisation. Here, technologies from a number of suppliers have received wide application, often linked with proprietary models developed by the operating company. If interoperable components exist, new ones can be added and legacy components can be upgraded easily when the plant capability must be enhanced.
Optimisation models, whether on-line or off-line, can be kept current with interoperable components. Off-line optimisation can be used to make step changes in the operation of the plant, outside the current range of operation of the process. For example, an unpublished Dow project resulted in an average 18% reduction in distillation energy requirement and a comparable increase in tower capacity. These towers represent a broad range of product types, including speciality chemicals and commodity hydrocarbons.
In BP, a 5% improvement in throughput was obtained at an oil production site by optimisation of operational set points. In addition, at the same site, a significant summer production shortfall, caused by air cooler limitations, was eliminated. In both cases, the operating strategies proposed by the optimisation were counter to normal practice but, after the simulations were understood, fell firmly into the category of "Why didn't we think of that?" Of course, for off-line optimisation to be used in this way, it is vital that the process simulation represents the actual equipment installed on the plant, with all its limitations. As we have discussed above, this is more likely to be achieved if software components representing the actual pieces of equipment can be readily plugged together. Likewise, applying the latest theoretical understanding of unit operations from the academic community can bring a model to a higher level.
Other solutions involving custom software are usually too slow and expensive, so potential benefits can go unrealised.
There are many plants for which neither off-line nor on-line optimisation is necessary or even currently feasible. In these cases, the creation and maintenance of a model is often sufficient to provide significant process understanding and thus improve the operation of the plant. An unpublished example from the chemicals stream of BP describes the use of such models, which resulted in $0.65 M/a operational savings whilst also reducing effluent by up to 50%.
While the examples quoted did not, in fact, depend on interoperability to deliver their benefits, the creation of the necessary high-fidelity models would have been much more efficient had it been available. Model maintenance will also benefit greatly from the interoperability of software components.

Control
Although process control is not traditionally considered CAPE, proper plant design fully integrates process control and process engineering. There are therefore many areas where interoperability can be used to assist the control engineer. The use of simulation to aid control system design has already been described, but there are other synergies to be obtained, especially in the area of advanced control.
Well-maintained process models can be used in advanced control algorithms. Such reuse is almost certainly not "as is", because of the speed requirements for the models used in control, but simplified models can be generated based on the process simulation.
Process models can be used during the plant trials ("step tests") required to tune the advanced control system. This can either be done by predicting which plant trials are required, thus avoiding unnecessary trials, or even by replacing some of the trials with off-line modelling. Having the most accurate and highest quality model for this application is critical, and, as we have discussed, the use of interoperable components supports the creation and maintenance of the best possible process simulation.

Decommissioning/De-rating
Decommissioning is of particular interest in the oil and gas sector, when a field is near the end of its life. However, de-rating in response to plant throughput reductions, or where there is cyclical demand, is of interest in all sectors.
The optimal decommissioning/de-rating of plants is an area where process simulation tools can contribute significantly. By the final period in a plant's life cycle, if it has not benefited from the advanced techniques described above, the original process simulation is likely to be hopelessly out of date. Equipment is often operating in a different mode and far away from its design point, so substantial revisions may well be needed to achieve a realistic simulation. However, the plant is at a time in the life cycle when the focus on cost containment is at its sharpest. The reduction in effort required to create or update the models, if interoperable components are used, could therefore be very significant.
During decommissioning/de-rating there may be a loss of processing flexibility, which will usually affect the ability to respond to plant upsets and equipment failures. Safety must be assessed, as with any process change. These complex issues all have to be included in any study, which is really only feasible with accurate, flexible process models. As before, the production of these models can be facilitated by the use of interoperable components.

IMPACT OF INTEROPERABILITY ON THE BUSINESS CASE
All of the above activities have been done to varying degrees for many years, both with and without CAPE. However, it is clear that modern CAPE tools have brought a significant increase in capability to these tasks, making some of them a practical reality rather than a theoretical possibility. The generation of counter-intuitive optimised solutions falls firmly into this category. Interoperability brings a further step change in capability. With interoperability, the process engineer has access to the complete range of best-in-class PMCs and is therefore more likely to construct the desired high-fidelity process model. Without interoperability, the effort required to construct the model can be significant, and thus it is less likely that the model will be built, or that the model will be of the desired quality. As project timescales shrink, this becomes more and more of an issue. We cannot overemphasise the impact that reducing this barrier has on the likelihood that the project will ultimately deliver benefits. Having to write just one custom component can be enough to abort many industrial projects. Interoperability offers ease of use, speed, accuracy, and quality control during model construction. This includes the ability to use detailed models from vendors, equipment suppliers, and academics that not only represent the general unit operation, but incorporate the specifics of a particular hardware design, catalyst performance, etc. The ability to easily bring this level of sophistication to model development means that interoperability becomes an enabling requirement for high-fidelity modelling.

THE CAPE-OPEN INITIATIVE
It is clear that an agreed set of interface standards is a prerequisite for successful interoperability. This was both the goal and the accomplishment of the CAPE-OPEN initiative.

Background
Internal discussions at BP in the early 1990s identified the need for interoperability standards to allow the development of advanced process analysis tools without having to maintain an in-house flowsheet simulator. Brief discussion at the FOCAPD conference in 1994 showed that BASF also shared these requirements, but, at the time, there did not seem to be a great deal of wider interest. However, both companies independently developed the idea, BP via the European Union project PRIMA and BASF via the German consortium IK-CAPE. When both sought to submit research project proposals to the EU, they were amalgamated into a single proposal that became CAPE-OPEN. This finally started in January 1997 and was succeeded in 1999 by the Global CAPE-OPEN project. This was also an EU project, but this time it was supported by IMS as well, which gave it a global reach. By the end of Global CAPE-OPEN, the major simulator vendors had committed to the inclusion of CAPE-OPEN facilities in their products, and public demonstrations of practical interoperability had been given.

The CAPE-OPEN Standard
The CAPE-OPEN reference model [11] shown in Figure 1 illustrates the components, interfaces, and communication protocols that comprise the CAPE-OPEN standard. The standard is based on an object-oriented model and assumes that a process simulation tool can be made of several components.
Communication between these components is carried out using Microsoft COM or OMG CORBA middleware technology. The business interfaces include such process engineering applications as physical properties, unit operations, numerical solvers, and flowsheet analysis tools. Common interfaces support general functions such as identification, error handling, and parameter exposure. CO-compliant software components are divided into PMCs and PMEs, where a PMC is a plug-in component, such as a distillation model, and a PME is an environment that accepts a PMC, such as a flowsheet simulator.
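The flavour of these common interfaces (identification, error handling, parameter exposure) and of a business interface can be conveyed with a minimal sketch. This is not the CAPE-OPEN IDL: the class and method names below are simplified assumptions chosen for illustration.

```python
class CapeError(Exception):
    """Stand-in for the standard's common error-handling interfaces:
    every component raises one well-known exception type."""

class Parameter:
    """Parameter exposure: a named public value with optional bounds."""
    def __init__(self, name, value, lower=None, upper=None):
        self.name, self.lower, self.upper = name, lower, upper
        self.set(value)

    def set(self, value):
        if (self.lower is not None and value < self.lower) or \
           (self.upper is not None and value > self.upper):
            raise CapeError(f"{self.name}={value} outside "
                            f"[{self.lower}, {self.upper}]")
        self.value = value

class UnitOperation:
    """Minimal business interface: identification plus a Calculate method."""
    def __init__(self, name, description):
        self.name, self.description = name, description   # identification
        self.parameters = {}                              # parameter exposure
        self.inlet, self.outlet = [], None

    def calculate(self):
        raise NotImplementedError

class Mixer(UnitOperation):
    """Trivial PMC: sums the flows of its connected inlet streams."""
    def calculate(self):
        if not self.inlet:
            raise CapeError("Mixer has no connected inlet streams")
        self.outlet = {"flow": sum(s["flow"] for s in self.inlet)}

m = Mixer("MIX-1", "adiabatic stream mixer")
m.inlet = [{"flow": 10.0}, {"flow": 5.0}]
m.calculate()

bounds_error = None
try:
    Parameter("reflux_ratio", 12.0, lower=0.5, upper=10.0)
except CapeError as e:
    bounds_error = str(e)
```

A host PME needs to know nothing about the `Mixer` beyond these interfaces, which is what allows a plug-in distillation model or solver to be swapped in the same way.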

Current Status
The future of the CAPE-OPEN standard is now in the hands of the CO-LaN (CAPE-OPEN Laboratories Network). This is a not-for-profit organisation funded by subscription from process industry company members, with a wide associate membership of software vendors and academics. The CO-LaN is committed to maintaining and extending the CAPE-OPEN interface in the light of experience with implementation and the changing needs of the process industries. Full information is available on the CO-LaN web site, www.colan.org (Fig. 2).
All of the major process simulator vendors have committed to include CO-compliance in their products. Recent reports by CO-LaN members have shown that commercial releases are now providing good interoperability performance. For example, the physical property packages used in AspenPlus and HYSYS are both available as CO components, which means that users of these simulators can now use their familiar physical property models for specialist studies in a simulator such as gPROMS from PSE. Alternatively, process engineers could use an independent thermodynamic package, such as Multiflash from Infochem, in their compliant PME of choice. Similarly, a unit model developed in gPROMS can be wrapped as a CO unit operation PMC for specialist studies in any compliant PME. These are just a few of the CO-compatible products available in a market that is beginning to change quite rapidly at the time of writing.
It is clear that we have reached a milestone in the development of interoperability standards for process simulation, now that compatible products are commercially available.
Figure 2: CO-LaN website

Like all powerful new developments, CAPE-OPEN interoperability requires responsible use, and in the remainder of this paper we look at some of the issues that it raises for individuals and organisations.

Legal/professional
In most countries, professional activities are increasingly governed by legislation, such as, for example, the United Kingdom Health and Safety at Work Act. Although the details may differ from one country to another, what this basically means is that, whatever the sophistication of the computer programs used, the responsibility for the decisions made remains firmly with the individual engineer and his/her organisation. We therefore need to examine the requirements that this implies for CAPE tools, such as flowsheet simulators and equipment design tools.
In the case of common tools, the guidelines developed by the UK Institution of Chemical Engineers provide a useful discussion of the issues involved in the responsible use of CAPE tools (see the Good Practice Guidelines at http://www.icheme.org/enetwork/MainFrameset.asp?AreaID=172). These guidelines cover the checks necessary before using the results of CAPE tools and hence imply the facilities that the tools need to provide to allow this to happen conveniently. These facilities can be summarised as follows:
- Thermophysical data: the origin of the data, its range of applicability, the quality of the fitted model, and its extrapolation properties should all be easily available.
- Engineering model: the range of applicability MUST be clear.
- Input checking: it must be simple to check input data.
- Results checking: it must be simple to check mass and energy balances at all levels of detail. Error messages must be clear and unambiguous.
- Convergence criteria: they must be clear, as must the status of convergence.
- Sensitivity analysis: it must be simple to apply to the overall solution.
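The mass-balance check mentioned in the guidelines is simple to automate. The sketch below assumes, purely for illustration, that streams are plain dictionaries of component molar flows; any real simulator's data model is richer than this.

```python
def mass_balance_error(inlets, outlets):
    """Return the relative closure error of a unit or flowsheet mass balance.

    inlets/outlets: lists of streams, each a dict of component flows (kmol/h).
    """
    total_in = sum(sum(stream.values()) for stream in inlets)
    total_out = sum(sum(stream.values()) for stream in outlets)
    if total_in == 0:
        raise ValueError("no inlet flow: balance is undefined")
    return abs(total_in - total_out) / total_in

# Hypothetical distillation column balance (flows in kmol/h)
feed = {"benzene": 60.0, "toluene": 40.0}
tops = {"benzene": 59.0, "toluene": 1.5}
bottoms = {"benzene": 1.0, "toluene": 38.5}

err = mass_balance_error([feed], [tops, bottoms])
```

A PME that exposes stream data in a standard way allows exactly this kind of independent check to be run over any assembled simulation, whatever the origin of the individual components.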
With CAPE interoperability, there is an added layer of concern and complexity when using PMEs and PMCs from a variety of sources. The individual performance of each PMC and PME must be considered, as well as their behaviour when they are linked. The suppliers of the software components can be held responsible for the suitability of each PME or PMC, both for the designed tasks and for compliance with interoperability standards. However, they cannot be responsible for the details of interactions that may occur in a specific model configuration. It is incumbent on the user to appropriately confirm that the assembled simulation performs as required.

Documentation
In the business environment described earlier, one of the trends is the likelihood that more changes of personnel will happen in the course of a project, and especially in the life of a plant, than used to be the case. In addition, these people are likely to work for a wider variety of companies than before, as outsourcing becomes more prevalent. A consequence of this is that good "organic" documentation becomes highly desirable for any model/program/design that is likely to be used or modified by more than one person. Here "organic" documentation means documentation that is intimately associated with the model/program/design and that is updated whenever anything changes. It will almost certainly be different from formal project documentation, which tends to be rather more static. It should, for example, record all changes and the reasons for the changes, along with the date and the person who made the change. Without this information it is difficult to judge, at a later date, whether it is safe to make changes, with the result that it is often regarded as easier to start again, or just to leave things as they are. Neither of these options is necessarily the best for the business. Interoperability does not alter the need for such documentation, but it does provide another facet of the project that needs to be recorded.
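A minimal form of such "organic" documentation is a machine-maintained change log kept next to the model file. The sketch below is one possible shape, assuming a simple JSON file; the field names and file layout are illustrative, not a prescribed format.

```python
import datetime
import json
import os
import tempfile

def record_change(log_path, author, reason, detail):
    """Append a dated change record to a model's companion change log,
    creating the log on first use. Returns the full history."""
    entry = {
        "date": datetime.date.today().isoformat(),
        "author": author,
        "reason": reason,
        "detail": detail,
    }
    try:
        with open(log_path) as f:
            history = json.load(f)
    except FileNotFoundError:
        history = []
    history.append(entry)
    with open(log_path, "w") as f:
        json.dump(history, f, indent=2)
    return history

# Demonstrate on a throwaway file (hypothetical authors and changes)
path = os.path.join(tempfile.mkdtemp(), "model_changes.json")
record_change(path, "A. Engineer", "revamp study",
              "tray efficiency updated to 0.72 from plant test data")
history = record_change(path, "B. Engineer", "debottleneck",
                        "reboiler duty limit raised after exchanger retube")
```

Because every record carries the date, the author, and the reason, a later engineer can judge whether a proposed change is safe instead of starting again from scratch.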

Organisational
There are two extremes of organisation relating to the use and quality assurance (QA) of CAPE tools, in which:
- Teams (and individuals themselves, for that matter) are free to choose the tools that they use and can get these tools from any relevant source.
- All tools are tested and approved centrally, and then provided to the end user as "fit-for-purpose" for use within the entire organisation.

We therefore need to look at the impact of having interoperability between PMCs and PMEs on these two organisational structures.
One of the strong selling points for interoperability is that an independent supplier (be it an equipment manufacturer, software vendor, or academic body) can very easily make their CAPE tool available to a wide audience. The end users can then use such tools within their normal CAPE environment "as is", without having to worry about developing custom interfaces to link the various components. For example, in principle a user with a specific CAPE need can search on the Internet for a suitable component, immediately download the component, and plug it into a standard process simulator.
In an organisational structure where the individual is permitted (and possibly even encouraged) to do this, the responsibility for QA of the component itself and accountability for its use resides primarily with the individual. Such QA will need to cover all of:
- The engineering content of the component.
- The implementation details of the component:
  • especially in the case of interoperability, the ability to trap and report errors.
- The adherence of the component to the relevant interface standard:
  • it is possible that an independent certification process will exist in the future. This would provide a statement such as "this component is certified as adhering to the interface standard". If this is the case, then independent testing by each user may not be necessary. Such a process could also aid developers in rapidly bringing software components to market, by providing a clear statement that the component is compliant.
- The practical interoperability of the component within the CAPE environment in use:
  • even though all the interfaces adhere to the standards, it is still possible that there may be specific interactions that are not addressed by the standards. For example, a physical property system can calculate the existence of more phases than are supported by the PME in which it is being used.
- The suitability of the chosen tools for the specific conditions and systems in each project.
- Documentation of all of the above.

Note that such QA is always required, even where an integrated system from one supplier is used. The complexity will be greater with components from multiple suppliers, but will not necessarily be significantly more onerous.
In this type of organisation, there is therefore a strong requirement that management should communicate a clear and consistent corporate policy on acceptable risk in the use of CAPE tools. Furthermore, the organisation needs to provide and promote rigorous work practices that assist individual engineers to exercise their professional responsibility, whilst not restricting their ability to use their own judgement on the suitability of available components. This is the case for the use of CAPE in general, but it becomes even more important with the easy availability of interoperable components from many different sources.
At first sight, interoperability will have less impact on the work practices of a "centralised QA and approval" type of organisation. Both the end users and the central group will have all the advantages of easy access to best-in-class components described above, but the components will not be made generally available until the necessary QA has been undertaken centrally. While there will be many more components that can be used, the work processes employed to QA them will be no different from those already required in such companies.
However, a possible complication is that the "organisation" referred to here may be larger than a single company. It is very common practice for operating companies, for example, to outsource large portions of their engineering and CAPE activities to companies with specialist knowledge. In this case, it is essential for the outsourced suppliers to exercise the same level of care in the use of CAPE components as the companies themselves. In fact, it will be the responsibility of the operating companies to ensure that their QA requirements are met by all outsourced suppliers using CAPE. The exercise of this responsibility may need more attention as the market for interoperable components grows.
As a final comment on the organisational impact of interoperability, the organisational extremes described at the beginning of this section do not actually alter the QA actions required; they simply alter who executes them.

Technical
The technical impact of interoperable process modelling components and environments should be low, as long as the suppliers of both PMCs and PMEs have done their job properly. This is because the components should simply appear as additional options within a PME, which the user can select in the same way as any native component.
One scenario where there will be an impact is when the functionality of a PMC is greater than that of the PME in which it is being used, despite the interface between them being correct. An example of this has already been described, where a property package can calculate the co-existence of more phases than is supported by the PME. In this case, a decision needs to be made by the PME and PMC developers on what action to take if this scenario occurs.
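One defensible design choice is to detect the mismatch at configuration time, before any simulation runs. The sketch below assumes both sides can be queried for the phase labels they handle; the query mechanism and phase names are hypothetical, not part of any published standard.

```python
class PhaseSupportError(Exception):
    """Raised when a property component may report phases
    that the host environment cannot represent."""

def check_phase_compatibility(pme_phases, pmc_phases):
    """Fail fast at connection time, rather than mid-simulation,
    if the property package can report unsupported phases."""
    unsupported = set(pmc_phases) - set(pme_phases)
    if unsupported:
        raise PhaseSupportError(
            "property package may report phases not supported by the host: "
            + ", ".join(sorted(unsupported)))

# A two-phase simulator hosting a package that can also detect
# a second liquid phase (labels are illustrative)
problem = None
try:
    check_phase_compatibility({"vapour", "liquid"},
                              {"vapour", "liquid", "liquid2"})
except PhaseSupportError as e:
    problem = str(e)
```

Whether the right response is to refuse the connection, warn the user, or silently lump the extra phase is exactly the decision the PME and PMC developers must make; the check only guarantees the decision is taken deliberately.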
The major technical impact is on the suppliers of the PMCs and PMEs and on those responsible for the maintenance of the standards (CO-LaN). A number of requirements for such suppliers and the standards organisation are identified here. If implemented, these will increase the likelihood that interoperability will actually be achieved.
The first requirement for interoperability is that a PMC or PME must fully implement a specific standard. Without this, a component that "implements" a standard may not work with a PME which also "implements" the same standard, since one or the other may not have the required parts of the interface available. However, this also implies that interoperability will be easier to achieve if each standard is as compact as possible, i.e. if the number of methods that need to be implemented is low. This will make it simpler for a PMC or PME supplier to fully implement the standard and hence guarantee interoperability with any other PMC or PME which also implements it.
The second requirement is that the specific implementation of an interface standard must strictly adhere to the standard. A misinterpretation of the standard by the supplier, or an implementation which contains bugs, will mean that interoperability may not be achieved. This will be easier to achieve if "testers" exist that fully and rigorously test an interface implementation in both PMCs and PMEs. Such testers are therefore a necessary deliverable from the standards organisation, along with the standards themselves.
The final requirement identified here is that the code of either a PMC or PME must be written to a very high standard. Because the source code of all the components is no longer held by a single supplier, it is much more difficult to find and correct any bugs that exist. This is especially true when the symptoms of a bug do not appear in the component which contains it. An example is a PMC whose code contains a "memory over-write". In this case, the bug may or may not cause an error, depending on the PME with which the component is used. Furthermore, even if an error does occur, the exact symptom(s) of the bug may vary from PME to PME, or even from model to model within a single PME.

THE FUTURE
In the medium to longer term, we envisage that process simulation and modelling activities will broaden in scope and encompass more and more of the business environment in which the plant operates. This will be particularly so where optimisation techniques are used. Thus being able to plug the process model into a wider business model, or to plug financial and supply chain functions into a process simulation, will become even more attractive. The market for specialised components should grow rapidly with the removal of the barriers caused by incompatible interfaces. Interoperability affords the prospect of more appropriate simulations, delivered more quickly with a lower lifecycle cost.
Another trend in certain segments of the industry is towards smaller and more flexible plants. The rapid changeover of products, recipes, and even processes in such facilities means that dynamic models that are better, faster, and easier to build and use will become more of a requirement for successful operations. Such models will be required not just for plant design, but for safe and efficient operation, making use of dynamic on-line simulations for process control. Interoperability will significantly ease the production of such models.
At the same time that some parts of the process industries move to smaller, more flexible facilities, elsewhere truly world-scale plants will become ever larger. The same issues of safety and efficiency through on-line and off-line dynamic modelling apply. In an extremely large facility, improvements of a fraction of a percent translate into significant gains in financial performance. Simultaneously, the risks of process upset and poor control carry increasingly serious consequences.
These requirements imply ever-increasing computing power to accomplish the modelling and to support the added demands resulting from interoperability and ease of use. Model complexity and computer processing power have existed in a "chicken and egg" relationship for decades, with modellers always pushing at the limits of computing power, and computer hardware vendors leapfrogging their own technology every couple of years. The implementation of a sophisticated GUI (graphical user interface) and the requirement to "plug and play" interoperable components each add their own demands on the computer hardware. Future hardware may manifest itself in such advanced techniques as parallel processing, virtual computing networks, and beyond.
In light of these issues, once the market for interoperable CAPE software components grows beyond a certain size, it will become difficult to keep track of all the possible options available to solve a particular problem. The CO-LaN web site (www.colan.org) will still be able to supply component and supplier names, and it could allow users to post comments and reviews on their experiences with components and simulators. CO-LaN is unlikely, however, to be able to review in detail the technical capabilities of the software components. Perhaps limited downloadable trial versions will become more common.
As the market increases further, general web search engines could help to locate suitable components, and later active agents could search the Internet for solutions to problems posed in engineering and business terms [12]. Thus we can envisage a scenario in which the building blocks for a powerful and appropriate simulation can be assembled and implemented quickly and easily, given a physical description of the engineering or business problem to be solved.
Of course, once the appropriate simulation has been created for the application in hand, the QA problems already identified will remain. In those applications with significant safety or financial implications, there is no simple alternative in sight to detailed review by a responsible professional engineer or team. What interoperability does is simplify those parts of a job that are basically mechanical, and so allow the engineer to concentrate on the creative and critical aspects. Perhaps there is no need to seek an alternative to this scenario.

CONCLUSION
This paper has examined the business case for process simulation software interoperability, reviewed the current status of the CAPE-OPEN interoperability standard, and assessed the current and future markets for interoperable components. It has presented a predominantly industrial viewpoint.
It is clear that the widespread acceptance of interface standards and the availability of compliant components will strengthen the contribution made by process simulation software to the process industries. It will enable high-fidelity process models, encompassing a broader business scope, to be created more quickly and more cheaply by stimulating the availability of best-in-class components from a wide variety of suppliers. The impact will be felt across all aspects of CAPE activity.
As with all new technologies, interoperability will provide some challenges to the organisations that adopt it. In particular, quality assurance procedures will need to be reviewed at both a corporate and an individual level to ensure that the implications are fully understood.
The current status is that the CAPE-OPEN standard has been adopted by the major process simulator vendors and by many component developers. Commercially supported software is now in the marketplace, with new examples appearing regularly, and an organisation (CO-LaN) is in place to maintain and promote the standard into the future.
Interoperability of process simulation components is now a practical reality; CAPE practitioners can therefore begin to reap the benefits.

Figure 1 Reference model.