What are the Needs for Process Intensification?

Process intensification is now well known in the world of chemical engineers. The development of new chemical routes and of innovative modular technologies may lead to breakthrough progress, and several success stories have already been demonstrated. In particular, batch-to-continuous transposition is one of the key routes for intensifying processes. However, process intensification requires the collection of many new, high-quality data relative both to the chemical reaction and to the equipment, which sometimes represents a huge effort. This discussion is the opportunity to review the basic data and the associated metrology for the characterization of applications and technologies, in order to guide the design of the intensified process. Concerning the chemicals, the stoichio-kinetic parameters, the phase equilibria data (for instance solubility) and the reaction enthalpies can be efficiently determined thanks to microfluidic devices. Concerning the equipment, benchmark studies have to be performed to compare the technologies according to pressure drop, residence time distribution, heat and mass transfer performances, mixing efficiencies and so on. One way of capitalizing on this background knowledge lies in the development of simulation tools, here called the Data Processing Tool. This presentation also intends to introduce some of the European platforms and networks.


INTRODUCTION
Process Intensification (PI) is now well known in the world of chemical engineers (Stankiewicz and Moulijn, 2003; Cybulski et al., 2011; Reay et al., 2013; Boodhoo and Harvey, 2013). The development of new chemical routes and of innovative modular technologies may lead to breakthrough progress (European Process Intensification Roadmap, 2007), and several success stories have already been demonstrated. In particular, batch-to-continuous transposition is one of the key routes for intensifying processes (Hellier et al., 2010; Anxionnaz et al., 2010; Elgue et al., 2012; Shen et al., 2014; Martin et al., 2014).
However, process intensification requires the collection of many new, high-quality data relative both to the chemical reaction and to the equipment, which sometimes represents a huge effort. This requirement calls for innovative advances in experimental, methodological, and simulation tools: not only the process has to be intensified, but also the process design methodology.
In this paper, we present the process intensification methodology followed in our lab and review the basic data and the associated metrology for the characterization of applications and technologies, in order to guide the design of the intensified process.

Definition
Since the 1980s, an impressive number of diverse definitions of the PI concept have been proposed (Ramshaw and Arkley, 1983; Cross and Ramshaw, 1986; Stankiewicz and Moulijn, 2000; Tsouris and Porcelli, 2003; Charpentier, 2007; Van Gerven and Stankiewicz, 2009). One of the most recent, proposed during the elaboration of the European Roadmap of Process Intensification in 2007, reads as follows: "Process Intensification (PI): a set of often radically innovative principles ("paradigm shift") in process and equipment design, which can bring significant (more than factor 2) benefits in terms of process and chain efficiency, capital and operating expenses, quality, wastes, process safety". It is easy to notice that this definition rests more on the notion of breakthrough than of incremental innovation. We could try to summarize this objective in the following way: PI simply means "using much LESS to produce MUCH more and BETTER", in which "less" is related to investment, space, time, raw materials, energy, inventory, and so on, and "much" refers to factors or orders of magnitude. This is indeed an attractive slogan, but a very demanding and challenging one.

Challenges Linked to Process Intensification
What are the challenges linked to PI? There is a need to identify where the limiting steps are exactly located in the process: is it chemistry, transport phenomena, or equipment? Therefore, an accurate diagnosis of the process is required. If the process is chemistry dependent (the so-called chemical regime), the key issues are catalysis, new chemical routes, or new operating windows (temperature, pressure, concentration).
If it is transport phenomena dependent (the so-called diffusional regime for heat or mass transfer), the key issue is linked to technology or, possibly, to new operating windows.
Facing these challenges, PI appears as a kind of toolbox for the engineer, essentially based either on equipment (hardware) or on methods (software), leading to different technological solutions with a large variety of examples of technologies or combined technologies. Given this diversity, it becomes necessary to develop a process intensification methodology to help the engineer make the optimal choice according to the application at hand. A few first attempts have begun to emerge (Prudhomme et al., 2013; Commenge and Falk, 2014). We describe in what follows the principle of the methodology that we advocate.

Process Intensification Methodology
The issue is therefore to find what could be the optimal solution in relation to the process involved. Clearly, a methodology is needed.
From our point of view, this methodology should be based on two blocks (Fig. 1): one linked to the equipment characterization, the other linked to the chemical path characterization and, more generally, to basic data acquisition (phase equilibria, for instance). Furthermore, via correlations or modeling, it is then possible to feed a generic simulation tool, called here the Data Processing Tool (DPT). This modeling step may be considered as a way to capitalize on the knowledge. The deliverables of such a simulation tool include process simulation, optimization and design, and possibly piloting and aid to equipment selection.

Modeling Background
The role of the modeling is to reproduce and predict the behaviour of any intensified process according to the operating conditions. From that viewpoint, the description of the equipment must be as true to life as possible. For instance, focusing on Heat-EXchanger (HEX) reactors (Elgue et al., 2007), a realistic description based on a modular structure has been adopted (Fig. 2). This kind of structure, built by stacking different plates, allows various configurations to be described, whatever the reactor type, in terms of size, flow rate, flow configuration and material.
In that case, the reactor is assumed to be similar to a continuous reactor with heat transfer taking place through the walls. Flow modeling is therefore based on the same hypothesis as the one used for the modeling of a real continuous reactor (Villermaux, 1985; Naumann, 2002), represented by a series of Continuous Stirred Tank Reactors (CSTR). This approach is related to the experimental residence time distribution, which allows flow analysis (highlighting of dead volumes, preferential passages or short-cuts). Such a description makes it very easy to represent all possible flow configurations of the reactor (co-current, counter-current). In fact, it implies that the behaviour of a cell depends solely on the inlet streams and on the phenomena taking place inside: reaction, heat transfer, etc. Since the inlets of a given cell are generally the outlets of the preceding cell, any flow configuration may be represented by a correct discretization, i.e. the determination of the cell that precedes (Fig. 3).
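As a minimal sketch of this tanks-in-series representation, the residence time distribution E(t) of a cascade of n identical CSTRs (the classical model used to interpret experimental RTD measurements) can be written in a few lines of Python; the 60 s residence time below is an illustrative value, not a measured one.

```python
import math

def tanks_in_series_rtd(t, n_tanks, tau_total):
    """Residence time distribution E(t) of n identical CSTRs in series.

    tau_total is the mean residence time of the whole cascade, so each
    cell has tau_c = tau_total / n_tanks.  As n_tanks increases, E(t)
    narrows around t = tau_total, approaching the plug-flow limit.
    """
    tau_c = tau_total / n_tanks
    return (t ** (n_tanks - 1)
            / (math.factorial(n_tanks - 1) * tau_c ** n_tanks)
            * math.exp(-t / tau_c))

# Illustrative comparison at t = tau = 60 s (hypothetical residence time):
# a single CSTR spreads the distribution, a 50-cell cascade peaks sharply.
e_single = tanks_in_series_rtd(60.0, 1, 60.0)
e_cascade = tanks_in_series_rtd(60.0, 50, 60.0)
```

Fitting the measured RTD with the number of tanks n then quantifies the deviation from plug flow, since dead volumes and short-cuts distort the measured curve relative to this ideal model.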
Given the specific geometry of the reactor, two main parts may be distinguished. The first part is associated with the "process plate" where complex phenomena coupled with reactions occur. The second part encompasses the rest of the reactor structure, involving "utility" fluid, plate wall, etc.

A Process Insight
Thanks to modeling and simulation, it is possible to better understand and predict the process operation.
For several years in our lab (Prat et al., 2005; Benaïssa et al., 2008a, b), an oxidation reaction has been used as a test reaction to compare the performances of different continuous intensified technologies. The test reaction is the sodium thiosulfate oxidation, a fast and strongly exothermic reaction (Lo and Cholette, 1972). The reaction scheme and kinetics model have been implemented into the DPT, so that it is possible to simulate the temperature and concentration profiles along any type of reactor.
For instance, Figure 4 shows the simulated temperature and conversion profiles from the inlet (left) to the outlet (right) inside a G1 Corning® Advanced-Flow™ reactor.
The simulation results highlight the accuracy of the modeling with respect to both temperatures and reaction yield. Such accuracy is all the more interesting in that, in this application, the heat transfer and reaction aspects are strongly coupled, due to the high exothermicity. In the case of the G1 Corning® Advanced-Flow™ reactor, where heat transfer performances are enhanced, a very high reaction sensitivity to cooling is observed: the yield and the heat generated are directly related to the utility fluid conditions. For instance, the utility temperature has to be set above 50°C to reach a high conversion level within the given short residence time (only one reactor plate). Such a sensitivity is described very accurately by the simulations, which allow internal temperature profiles to be estimated in an equipment where internal thermocouples cannot be implemented. The DPT then offers a better understanding of the process and interesting perspectives from a safety viewpoint (detection of hot spots, of reactant accumulation, of bad mixing, etc.), particularly for applications presenting high dynamics and fast kinetics and for technologies where it is not easy to insert sensors.
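The cell-by-cell coupling of mass and energy balances that underlies such simulations can be sketched as follows. The Arrhenius parameters, reaction enthalpy and heat transfer coefficient below are illustrative placeholders, not the fitted values of the actual DPT; the sketch only reproduces the qualitative sensitivity to the utility temperature discussed above.

```python
import math

R_GAS = 8.314  # J/(mol.K)

def cell_balance(c_prev, t_prev, tau, t_util, ua_over_v,
                 k0=2.0e5, ea=60_000.0, dh=-5.9e5, rho_cp=4.0e6):
    """Steady-state balances for one CSTR cell of the cascade.

    Mass balance with a second-order rate r = k(T) c^2 gives a quadratic
    in c; the energy balance exchanges heat with the utility side through
    ua_over_v (W/(m3.K)).  All kinetic and thermal parameters here are
    illustrative placeholders, NOT the fitted values of the actual tool.
    Units: c in mol/m3, temperatures in degrees C, tau in s.
    """
    temp = t_prev
    for _ in range(200):
        k = k0 * math.exp(-ea / (R_GAS * (temp + 273.15)))
        # c_prev - c = k c^2 tau  ->  positive root of the quadratic
        c = (-1.0 + math.sqrt(1.0 + 4.0 * k * tau * c_prev)) / (2.0 * k * tau)
        # rho_cp (t - t_prev)/tau = (-dh) k c^2 + ua_over_v (t_util - t)
        t_new = ((rho_cp * t_prev / tau - dh * k * c * c
                  + ua_over_v * t_util) / (rho_cp / tau + ua_over_v))
        if abs(t_new - temp) < 1e-6:
            break
        temp = t_new
    return c, temp

def cascade(n_cells, c_feed, t_feed, tau_total, t_util, ua_over_v):
    """March through the series of cells; returns (c, T) per cell."""
    c, temp, profile = c_feed, t_feed, []
    for _ in range(n_cells):
        c, temp = cell_balance(c, temp, tau_total / n_cells, t_util, ua_over_v)
        profile.append((c, temp))
    return profile

# With these placeholder parameters, a warm utility lets the exothermic
# reaction ignite within the short residence time; a cold one quenches it.
profile_hot = cascade(30, 500.0, 25.0, 30.0, 70.0, 5.0e5)
profile_cold = cascade(30, 500.0, 25.0, 30.0, 20.0, 5.0e5)
```

The same marching structure accepts any inlet/outlet wiring between cells, which is what makes the co-current and counter-current configurations of Figure 3 straightforward to represent.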

A Way for Process Design and Management
By simulation based on the use of the DPT, it is also possible to look for the optimal design aiming at 100% conversion.
As an example, Figure 5b gives the temperature and conversion profiles in an Alfa-Laval HEX-reactor prototype, the Open Plate Reactor (OPR), the precursor of the Alfa-Laval ART® Plate Reactor (Fig. 5a). The reactor design (number of plates) has been determined by simulation in order to reach the target of 100% conversion at the reactor outlet.
The simulated process temperature in Figure 5b exhibits a probable hot spot inside the reactor, close to the inlet of the reactants. Consequently, thanks to the DPT, it is possible to examine what the optimal operation of the reactor should be in order to avoid any hot spot. Figure 5c shows the result obtained by playing on R, the ratio of the utility flow rate over the process flow rate. Clearly, the larger the ratio, the flatter the process temperature profile inside the reactor.
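The effect of R can be illustrated with a lumped energy balance on the utility side: for a given heat duty, the utility temperature rise scales as 1/R, which is why increasing R flattens the process temperature profile. The duty and flow rate below are purely hypothetical numbers chosen for illustration.

```python
def utility_temperature_rise(q_reaction_w, q_process_m3_s, ratio_r,
                             rho_cp=4.0e6):
    """Lumped energy balance on the utility side.

    The heat released by the reaction (q_reaction_w, in W) is absorbed by
    a utility stream whose volumetric flow rate is ratio_r times the
    process flow rate, so its temperature rise scales as 1/R.
    rho_cp in J/(m3.K), water-like by default.
    """
    return q_reaction_w / (ratio_r * q_process_m3_s * rho_cp)

# Hypothetical duty: 5 kW released, 10 L/min of process fluid.
rises = {r: utility_temperature_rise(5_000.0, 10e-3 / 60.0, r)
         for r in (2, 5, 10, 20)}
```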
In conclusion, DPT may be considered as a process optimization tool not only for the design but also for the operation control and management.

A Way for Equipment Selection
Thanks to the DPT and the simulation of temperature and conversion profiles, it is possible to compare different technologies with one another, as in a benchmark challenge. As an illustration, consider again the Alfa-Laval open plate prototype, made of stainless steel and PEEK, and an innovative prototype developed in our laboratory, made of Silicon Carbide (SiC) and called Boostec (after the name of the SME manufacturer).
With the same operating conditions involving the sodium thiosulfate oxidation (fast and strongly exothermic), Figure 6 shows the temperature and conversion profiles along both prototypes: Figure 6a relative to the Alfa-Laval technology, Figure 6b relative to the Boostec technology. Clearly, because of the reactor material, the temperature profiles differ radically. The Boostec reactor, thanks to the good thermal effusivity of SiC (Despènes et al., 2012), allows quasi-isothermal operation. It is not the objective here to go further into the comparison between the two technologies, since other criteria should obviously be examined.
The aim was just to show how the DPT could be used as a way of selecting equipment according to various criteria. This approach may be generalized. For instance, a collaborative project named PROCIP, supported by the ANR (Prudhomme et al., 2013), is in progress, involving three industrial partners (Bluestar Silicones, Solvay-Rhodia, Processium) and three academic labs (Laboratoire de Génie Chimique - LGC, Laboratoire Réactions et Génie des Procédés - LRGP, Laboratoire de Génie des Procédés Catalytiques - LGPC). The aim is to develop an expert system guiding the equipment selection according to some specific criteria (process, chemicals, safety, and so on). The principle is as follows. A data-base is built and represents the software environment. This data-base consists of: the physico-chemical properties of the species (e-thermo™, Processium); the equipment characteristics (pressure drop, residence time, heat and mass transfer performances via coefficients, viscous or solid handling, corrosion, and so on); the chemical reactions (stoichiometry, enthalpies, viscosity, density and so on). These data may be either provided by the user, or estimated/calculated by the software, or already implemented. In the end, the software is expected to propose a list of appropriate technologies, with a quotation depending on the predominance of one criterion over another.
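The weighted-quotation principle can be sketched as a simple scoring function. The equipment names, criteria and scores below are hypothetical; the real PROCIP data-base and its quotation rules are not public, so this only illustrates how a dominant criterion drives the proposed short-list.

```python
# Hypothetical equipment records, scored between 0 (poor) and 1 (excellent)
# on each criterion.  Illustrative values only.
EQUIPMENT_DB = {
    "plate_hex_reactor": {"heat_transfer": 0.9, "solid_handling": 0.2,
                          "residence_time": 0.4, "corrosion": 0.6},
    "oscillatory_baffled": {"heat_transfer": 0.5, "solid_handling": 0.8,
                            "residence_time": 0.9, "corrosion": 0.5},
    "micro_reactor": {"heat_transfer": 1.0, "solid_handling": 0.1,
                      "residence_time": 0.2, "corrosion": 0.7},
}

def rank_equipment(weights, db=EQUIPMENT_DB):
    """Sort technologies by weighted quotation: a dominant criterion
    (large weight) drives the proposed short-list."""
    scores = {name: sum(weights.get(crit, 0.0) * value
                        for crit, value in props.items())
              for name, props in db.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# A process dominated by heat removal, with a milder residence-time need:
ranking = rank_equipment({"heat_transfer": 3.0, "residence_time": 1.0})
```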

Conclusion
In spite of its attractive and promising results, the PI methodology described above (Fig. 1) exhibits a major drawback. Indeed, this methodology imposes a considerable effort of basic data acquisition and, furthermore, of simulation. It could even be considered as a killer for process development, because there is a crucial requirement for time-to-market reduction, particularly in the frame of process innovation in a very competitive world. It implicitly means that not only the process has to be intensified, but also the way of intensifying it. There is nowadays a clear call for innovative advances in experimental and methodological tools, while simulation remains a key issue, with continuous improvement of the solving techniques and of computing power (calculation times).

Introduction
As previously said, there is a need for intensifying the methods and tools.
Concerning the equipment characterization, and aiming at the objective of accelerating technology selection, several platforms have emerged in Europe in recent years, able to provide equipment benchmark studies in order to guide the choice of equipment efficiently and to perform demonstration tests for the proof of concept. The objective of all these tools is to reduce the time spent on process development by offering services, methods or devices which allow the acquisition of the required basic data.

Equipment Qualification
Concerning the equipment qualification, benchmark studies have to be performed to compare the technologies according to pressure drop, residence time distribution, heat and mass transfer performances, mixing efficiencies and so on. Some of the results arising from the benchmark studies performed in our lab are detailed in (Despènes et al., 2012; Elgue et al., 2014; Raimondi et al., 2014).
To emphasize a result of such a methodology, a benchmark study has been performed at the MEPI in Toulouse based on an enzymatic esterification reaction (Elgue et al., 2014). The lipase-catalysed esterification of oleic acid with ethanol has been studied in various devices (from lab-scale Eppendorf tubes to continuous reactors at pilot scale), leading to the same conclusion: this esterification is limited by liquid-liquid mass transfer. The operation of this model reaction can therefore be considered as an assessment of the liquid-liquid mass transfer performances of each reactor, especially in terms of generation of interfacial area. As the species and the reaction involved here are easy to handle, this constitutes an excellent test to characterise the mass transfer performances of different devices, such as the Corning G1 glass reactor, the Chart Shimtec, the Nitech COBR and the AM Technology Coflore. A given global residence time was applied (140 s), at a temperature of 30°C, with a lipase concentration of 10 LU/mL (lipase activity, in Lipase Units per mL) and a molar ratio between ethanol and oleic acid of 3. This residence time of 140 s was chosen according to the reactors investigated (Corning and Chart): regarding the volume of these reactors, such a residence time leads to flow rates which ensure optimal performances in terms of mass transfer and, consequently, reaction conversion. For the COBR, the optimal amplitude and frequency were also chosen as those leading to the maximal conversion. Figure 7 illustrates the resulting comparison of mass transfer performances between the different devices.
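Because the reaction is mass-transfer limited, the measured conversion at a fixed residence time can serve as a probe of each device's volumetric mass transfer coefficient. A minimal sketch, assuming a first-order lumping in plug flow (an illustrative simplification, not the evaluation protocol of the cited study), with hypothetical conversion values:

```python
import math

def apparent_kla(conversion, residence_time_s):
    """Back out an apparent volumetric mass transfer coefficient kLa from
    the conversion of a fully mass-transfer-limited reaction, assuming
    X = 1 - exp(-kLa * tau) in plug flow.

    Illustrative simplification only: the cited benchmark compares
    conversions directly rather than through this lumped model.
    """
    return -math.log(1.0 - conversion) / residence_time_s

# Hypothetical conversions at the shared 140 s residence time:
kla_high = apparent_kla(0.60, 140.0)  # efficient interfacial area generation
kla_low = apparent_kla(0.20, 140.0)   # poorer contactor
```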
The Nitech COBR appears here as the most efficient. Moreover, this device can also reach long residence times (up to 100 minutes) while keeping its enhanced mass transfer performances.
The principle of the "microdroplet reactor" as a perfectly mixed batch reactor is now well established and is applied to many different unit operations (Mignard et al., 2011), including not only mixing and separation, but also precipitation or crystallization. It also allows the chemical transformation to be captured at the early stages of the reaction (a few milliseconds), as illustrated for instance in Figure 8, which shows the concentration fields of a reactant and a by-product in a liquid-liquid reaction at 8 ms in a microchannel (Raimondi and Prat, 2011).

Microfluidics: A Process Design Tool
In some cases, thanks to high-quality information via FTIR, NIR (Richard et al., 2013a) and Raman spectroscopy (Dorobantu et al., 2012) on-line analyses, it is possible to use microfluidics as a process design tool. As an example, the study of a transesterification reaction of vegetable oil (sunflower) with ethanol leading to ethyl esters, followed by a NIR technique at the microscale, made it possible to propose a new operating mode based on a continuous process coupling the glycerol removal and the displacement of the reaction equilibrium (Richard et al., 2013b).

Microfluidics: An Access to New Products
Besides, new operating windows investigated at the microscale under better controlled conditions (plug flow, heat and mass transfer) may eventually give access to new products, which is of great industrial interest.
For instance, the work performed by Marcati et al. (2010), including droplet generation, polymerization, and particle handling (solvent change or particle encapsulation in microchannels), has made it possible to generate complex solid structures (such as onion-like structures), pictured in Figure 9, which exhibits the different sizes and shapes obtained according to the flow rate of TriPropylene Glycol DiAcrylate (TPGDA), used as reactive diluent.

A CULTURAL BREAKTHROUGH
Applying the "Lab-of-future" attitude is undoubtedly a good way to develop innovative processes and, presumably, new products, but we are convinced that this is not only a technical challenge: it is also a question of efficiently sharing expertise and, finally, a question of knowledge dissemination. There is a need for new tools aimed at transferring knowledge and diffusing the methodology of intensification.
Based on the statement that the barriers are also cultural, three universities (Delft, Dortmund, Toulouse) decided in 2009 to create the EUROPIC network (www.europiccentre.eu). What is EUROPIC?
It is: an industrial community with common motivations in the field of Process Intensification, presently composed of 20 members from various sectors: Chemicals, Petchem, Fine-Pharma, Bio-Agro, Equipment and Engineering Services;
an industry-driven platform for knowledge and technology transfer in PI; a club-like consortium based on the dissemination of high-quality information: an updated data-base, accredited experts, industrial expert meetings, training courses on PI, and so on.

CONCLUSION
Process innovation through PI is challenging and requires the development of breakthrough tools. The methodology proposed here intends to combine HTE microfluidic-based devices at the lab scale with simulation tools (DPT) for process design, scale-up and optimization. The next step is then the proof of concept, thanks to demonstration platforms at the pilot scale (MEPI), in order to accelerate knowledge transfer and the dissemination of success stories through networks (EUROPIC).