As a champion for process modeling, Dr. Michael Call helps Lubrizol create value from process simulation tools, including batch modeling. I had the chance to talk to Michael about his experiences using process simulation for batch processes, how far the field has come, and where he hopes it will go in the future.
What do you do, and how did you get started?
My role is to foster and spread the use of process modeling to improve Lubrizol’s bottom line. What I do, in essence, is whatever it takes to achieve that. The most important part of the job is helping engineers use the software tools that we have, which encompasses training, support, and mentoring. My work empowers engineers to recognize opportunities to use process modeling and gives them the competence and confidence to exploit those opportunities.
My interest in process simulation was sparked as an undergraduate. I was doing a project on non-ideal mixing models, and I found that I really enjoyed using the physics of a process to explain how the process would behave. During my PhD work I continued in the area of non-ideal mixing in chemical reactors.
When I graduated, my first position was in a process modeling group. My first experience in batch modeling was with lab-scale batch polymerization in that position. Back then, I was doing much of my process modeling the old-fashioned way, programming in Fortran.
What excites you about batch simulation?
I would say two things. First is the challenge of it. In my opinion, batch process modeling is much more challenging than steady-state process simulation. It is inherently more complex.
Second is the value it delivers. Most of Lubrizol’s components and products are made using batch processes. By using process modeling to better understand the processes, I can (and do) improve these processes, creating substantial value for the organization.
What are the most important things you’ve learned about batch simulation over your career?
Check your assumptions. A model should be good enough to give valid answers to the questions that you have, but should not be any better or more complex than is needed. Thus, the proper assumptions about the physics are driven by what you need to do with the model.
You should be mindful of the assumptions (both explicit and implicit) you make when developing a model. If you find that your model does not behave the way you expect, or does not match the actual process, then you should re-examine the assumptions that you made. My doctoral thesis advisor, Robert Kadlec at the University of Michigan, instilled in me the importance of checking your assumptions and checking your results. This has served me well throughout my career.
Pay attention to the controllers. If you are creating a dynamic model, oftentimes the controllers in the process model are not configured or tuned the same way as those in the actual process. This can cause the dynamic response of the model to be quite different from that of the actual process. If you get the controllers (and the other physics) right, your model will show the proper dynamic response. This is not as obvious as having the right geometry or the right physical properties, but it can still be very important.
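The effect Michael describes can be seen with a toy sketch (none of these numbers or models come from Lubrizol's work; the process, gains, and time constant are all illustrative): the same first-order plant simulated under two different PI tunings produces very different transients, even though the "physics" is identical.

```python
# Toy illustration: identical plant, two PI controller tunings.
# All parameters (gains, time constant, setpoint) are made up for
# illustration -- not from any real process model.

def simulate(kp, ki, setpoint=1.0, tau=5.0, dt=0.1, t_end=30.0):
    """Integrate a first-order process y' = (u - y)/tau under PI control
    using a simple explicit Euler scheme; return the output trajectory."""
    y, integral, t = 0.0, 0.0, 0.0
    trajectory = []
    while t < t_end:
        error = setpoint - y
        integral += error * dt
        u = kp * error + ki * integral   # PI controller output
        y += (u - y) / tau * dt          # Euler step of the plant dynamics
        trajectory.append(y)
        t += dt
    return trajectory

sluggish = simulate(kp=0.5, ki=0.05)   # loosely tuned: slow approach
aggressive = simulate(kp=5.0, ki=1.0)  # tightly tuned: much faster response
# Same plant model in both runs; only the controller tuning differs,
# yet the transient responses diverge substantially.
```

If the model's controller were tuned like `sluggish` while the real plant ran like `aggressive`, the model would badly misrepresent the dynamic response, which is exactly the trap described above.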
Use process simulation from the ground up. I recently worked on a project where we used the process modeling tool from the conceptual stages through startup. We were on tight timelines, and this practice saved us significant time as we communicated with detailed design contractors and other stakeholders. I will cover this in more detail in an upcoming webinar.
What excites you about the future of batch simulation?
Less coding, more physics. In the past, I found myself focusing on coding rather than the problem at hand. With the growing commercialization of batch models, I can input the geometry and start working through the problem sooner. In addition, these models are becoming easier to use and increasingly robust.
Expanded scope for process models. More (and more detailed) batch models for unit operations (such as reaction, distillation, crystallization, etc.) are being developed, allowing more equipment and more processes to be analyzed. I can look at processes with multiple pieces of equipment to find the global optimum, rather than only looking at individual units.
Batch and continuous integration. The integration of batch and continuous modeling helps organizations drive tremendous value. This integration gives us the capability to optimize the full process by matching cycle times and optimizing tradeoffs that are not visible when looking at the batch or continuous parts separately.
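As a minimal sketch of the cycle-time matching idea (the batch size, cycle time, and continuous rate below are invented for illustration, not taken from the interview): a batch step can only keep a downstream continuous section fed if its time-averaged production rate meets the continuous draw rate, which fixes the number of parallel batch trains required.

```python
import math

# Hypothetical numbers: a batch reaction step feeding a continuous
# finishing section through an intermediate tank.
batch_size_kg = 8000.0        # product made per batch (assumed)
cycle_time_h = 10.0           # charge + react + discharge + clean (assumed)
continuous_rate_kg_h = 900.0  # continuous section's draw rate (assumed)

# Time-averaged supply rate from one batch train.
batch_rate_kg_h = batch_size_kg / cycle_time_h  # 800 kg/h per train

# Trains needed so that average batch supply covers continuous demand.
trains_needed = math.ceil(continuous_rate_kg_h / batch_rate_kg_h)
print(trains_needed)  # 2 -- one train's 800 kg/h cannot cover 900 kg/h
```

Looking at the batch and continuous parts together exposes tradeoffs like this one: a modest reduction in cycle time (or increase in batch size) could eliminate the second train entirely, which is invisible when each section is optimized in isolation.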
If you would like to hear more about Mike Call’s experiences with batch process modeling, sign up for our upcoming webinar.