Solids Processing: All Models Are Wrong but Useful

Feb. 5, 2024
In the future, artificial intelligence may develop unbiased models for intricate processes.

Solids processing presents several challenges, chiefly the difficulty of pinpointing a root cause among numerous possibilities. Achieving a specific particle size distribution (PSD) is particularly hard when the product passes through multiple stages of equipment, such as filtration, drying and agglomeration. The complexity intensifies when customers introduce new PSD specifications. You just can’t change a temperature in the crystallizer like you could for a distillation column. A lot of work goes into making such a change and defining the new process route. Adjusting the crystallizer conditions alone is insufficient to achieve the desired PSD; the change must survive solid/liquid separation, any drying steps, and even conveying and packaging.

We rely on models to guide changes in the crystallizer or reactor, leveraging available data on yield, selectivity and physical properties. However, because we lack a complete understanding of the process, we resort to correlations rather than fundamental principles. Correlations are valid only within the range of the data used to generate them. Although nucleation is stochastic, we often simplify it as a continuum. The process involves repeated cycles of new surfaces forming and then dissolving, ultimately creating new crystals. Growth adds further complexity: new surfaces generate new nuclei, and the growth rate depends on particle size and other physical properties. Numerous parameters contribute to the overall growth process, but it's uncommon to have all of them at our disposal. Despite the current buzz surrounding artificial intelligence (AI), our designs will continue to be grounded in correlations derived from laboratory-generated data until AI can develop unbiased models for such intricate processes.
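Correlations of this kind are often fitted as power laws in supersaturation. Here is a minimal sketch of that idea; the rate constants, exponents and fit range below are illustrative placeholders, not measured values:

```python
# Hypothetical power-law crystallization kinetics of the form often
# fitted from lab data: B = kb * dC**b (nucleation), G = kg * dC**g (growth).
# All constants are illustrative placeholders, not measured values.
KB, B_EXP = 1.0e6, 2.0   # nucleation rate constant and order (assumed)
KG, G_EXP = 1.0e-7, 1.5  # growth rate constant and order (assumed)

def nucleation_rate(supersaturation):
    """Nuclei per unit volume per unit time at a given supersaturation."""
    return KB * supersaturation**B_EXP

def growth_rate(supersaturation):
    """Linear crystal growth rate (m/s) at a given supersaturation."""
    return KG * supersaturation**G_EXP

# Like any correlation, these are trustworthy only inside the
# supersaturation range of the data used to fit them.
FIT_RANGE = (0.5, 5.0)  # e.g., kg solute per m^3 of solution (assumed)

def in_fit_range(supersaturation):
    lo, hi = FIT_RANGE
    return lo <= supersaturation <= hi
```

The range check is the important part: extrapolating a fitted correlation outside its data is exactly where such models stop being useful.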

For example, we made a product in a draft-tube crystallizer that used a fines destruction loop to control PSD. Changing the PSD was fairly straightforward. Luckily, our chemists could painlessly evaluate different conditions and parameters. They ran a series of small-scale test batches at various supersaturation levels to assess key parameters and answer some crucial questions:

  • Average particle size. What are the mean size and overall PSD?
  • Growth time. How long did it take to get to the above size?
  • Yield. How much of the solute was recovered in the isolated product?
  • Product shape. What did the crystals look like?
  • Filterability. How easily did the crystals filter?

These tests showed little change in product quality as particle size was altered. However, the various supersaturation levels provided enough information to modify the PSD in the draft-tube crystallizer. At first, we thought this would be enough to estimate the new cut size for the fines destruction loop based on growth rates.

This was not the case: the residence time in the loop was too short to remove enough of the fines. We then conducted more comprehensive tests in a supplier’s pilot plant to determine the cut size for the fines destruction loop. The problem was a lack of understanding of the dissolution process: particles took longer to dissolve than the crystal-size correlation from the lab tests predicted they took to form. A few years later, we had an opportunity to examine the crystallization and dissolution processes with an atomic-force microscope, which gave us real insight. Had that technology been available earlier, we could have skipped a lot of lab work and arrived at a realistic model sooner. Would AI have been an option?
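The lesson can be framed as a back-of-envelope check: a fines destruction loop only works if its residence time exceeds the time needed to dissolve a fine at the chosen cut size. A sketch, assuming a constant linear dissolution rate (the value here is invented, and in our case the real rate proved much slower than the growth-based correlation suggested):

```python
# Back-of-envelope check of a fines destruction loop: a fine survives
# the loop unless the residence time exceeds its dissolution time.
DISSOLUTION_RATE = 2.0e-8  # m/s, linear shrinkage rate (assumed value)

def dissolution_time(cut_size_m):
    """Time (s) to fully dissolve a particle of the given size (m)."""
    return cut_size_m / DISSOLUTION_RATE

def loop_removes_fines(cut_size_m, residence_time_s):
    """True if the loop residence time is long enough to dissolve
    fines at the chosen cut size."""
    return residence_time_s >= dissolution_time(cut_size_m)
```

With these assumed numbers, a 2-micron fine needs on the order of 100 seconds in the loop; underestimate the dissolution time, as we did, and the loop passes fines it was meant to destroy.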

Agglomeration is an area where models would be very valuable. We had a product made in a forced-circulation crystallizer, but growing the particles to the desired size would have taken an inordinate amount of time. We decided instead to make the product in a fluid-bed dryer using the fine product from the crystallizer. Our experience with a disperser that mixed fine particles with a little water in another application gave us some basic correlations, which used agglomerate particle size versus horsepower to predict performance. While this correlation got us into the correct ballpark, it took many hours to refine the design. The final empirical model used several parameters, such as rotational speed, wheel diameter, water/solids ratio and disperser area, to estimate the PSD. I ask myself, “Would AI be able to predict performance or explain the correlations we used?”
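Empirical models of that kind are commonly power laws in the operating parameters, fitted by log-linear least squares. A sketch of the fitting step: the parameter names follow the ones above, but the functional form, exponents and data are all hypothetical, generated synthetically just to show the method:

```python
import numpy as np

# Hypothetical agglomerate-PSD correlation of power-law form:
#   d50 = k * speed**a * wheel_dia**b * water_solids**c * area**d
# Taking logs makes it linear, so ordinary least squares recovers
# the exponents from pilot-plant data. The data below are synthetic.
rng = np.random.default_rng(0)
n = 40
speed = rng.uniform(500, 3000, n)         # disperser speed, rpm
wheel_dia = rng.uniform(0.1, 0.5, n)      # wheel diameter, m
water_solids = rng.uniform(0.05, 0.3, n)  # water/solids mass ratio
area = rng.uniform(0.01, 0.1, n)          # disperser area, m^2

true = dict(k=3.0, a=-0.4, b=0.8, c=0.5, d=0.2)  # invented exponents
d50 = (true["k"] * speed**true["a"] * wheel_dia**true["b"]
       * water_solids**true["c"] * area**true["d"])

# Log-linear regression:
#   ln d50 = ln k + a*ln(speed) + b*ln(dia) + c*ln(W/S) + d*ln(area)
X = np.column_stack([np.ones(n), np.log(speed), np.log(wheel_dia),
                     np.log(water_solids), np.log(area)])
coef, *_ = np.linalg.lstsq(X, np.log(d50), rcond=None)
fitted_k, fa, fb, fc, fd = np.exp(coef[0]), *coef[1:]
```

With noiseless synthetic data the fit recovers the exponents exactly; real pilot data, of course, would scatter, and the fitted model would again be valid only inside the tested ranges.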

I prefer employing fundamentals over correlations when constructing models for processes. Nonetheless, as my boss used to quip, "All models are wrong, but sometimes useful." Despite this, he indulged my inclination for modeling software. Perhaps, as our understanding of AI advances, we may discover more effective models in the future.

About the Author

Tom Blackwood, Solids Advice columnist | Contributing Editor

Tom Blackwood, a veteran engineer who has dealt extensively with solids over the course of his career, contributes regularly to Chemical Processing and serves as the Solid Advice columnist.