This blog post was inspired by my younger brother, who turned 17 over the weekend. It’s family tradition that I make his birthday cake—although we’ve branched out from cakes into other desserts the last couple of years—and it was in the middle of making a banana and caramel cheesecake that I remembered I still had to write this blog post. This got me thinking: is there a way to apply big data tools to baking?
Apparently, many corporate bakeries are data-driven. And why not? As we know, data can improve efficiency, which increases profits, and that interests any business. The example I’m going to analyse is from the company blog of Factora—an organization that plans, designs, and implements “Manufacturing Execution Systems (MES) and Manufacturing Operations Management (MOM) solutions.” In the post, author Michael Chang uses the story of a bakery to demonstrate the evolution of smart manufacturing with the introduction of big data.
The baking company—which was never named in the post—hired Factora to eliminate a “waste issue.” (On the production line, a certain type of cake was rising too high, too often.) From my research, I learned that variability like this is a common issue in food production. Products that are too small or too large cannot be packaged and sold, wasting company resources. Chang claims a variability problem is “one which smart manufacturing was born to remedy.” This does not mean, however, that creating the solution was easy. As Chang explains:
“For those of you with no experience in food production, imagine vats of flour, hundreds of gallons of water flowing through industrial hoses, bags of sugar – in all, 300 meters of mixing and baking machine. Is it the percentage of flour? The heat of the oven? The amount of water? The size of the eggs? Or a combination?
And did we mention that every line baker, over time, has developed their own way of managing the settings and producing the cakes? That the whole process is regarded as a type of expert black magic, known only to a skillful handful?”
Factora’s solution had two steps. First, data scientists used trend analysis—the practice of collecting information, spotting patterns, and using historical results to predict future outcomes—and test batches to develop “a new set of norms for production.” Second, Factora mounted overhead electronic displays to alert line bakers whenever any given KSF (key success factor) was out of spec; in other words, when a batch needed adjustment.
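The post doesn’t say how Factora computed those norms, but the general idea can be sketched in a few lines. Below is a hypothetical illustration: derive an acceptable range for each production parameter from historical batch data (a simple mean-plus-or-minus-k-standard-deviations control-limit rule), then flag any parameter that drifts out of spec. All parameter names, numbers, and limits here are invented for illustration.

```python
# Hypothetical sketch of the two-step approach: derive "norms" from
# historical batch data, then flag parameters a display would alert on.
from statistics import mean, stdev

def derive_norms(history, k=3.0):
    """For each parameter, take mean +/- k standard deviations
    as the acceptable range (a simple control-limit rule)."""
    norms = {}
    for p in history[0]:
        values = [batch[p] for batch in history]
        mu, sigma = mean(values), stdev(values)
        norms[p] = (mu - k * sigma, mu + k * sigma)
    return norms

def out_of_spec(batch, norms):
    """Return the parameters that fall outside their norms."""
    return [p for p, (lo, hi) in norms.items()
            if not lo <= batch[p] <= hi]

# Invented historical batches: flour ratio, water volume, oven temperature.
history = [
    {"flour_pct": 42.0, "water_l": 310, "oven_c": 180},
    {"flour_pct": 41.5, "water_l": 305, "oven_c": 182},
    {"flour_pct": 42.3, "water_l": 312, "oven_c": 179},
    {"flour_pct": 41.8, "water_l": 308, "oven_c": 181},
]
norms = derive_norms(history)
# A batch with far too much flour trips only the flour alert.
print(out_of_spec({"flour_pct": 48.0, "water_l": 309, "oven_c": 180}, norms))
```

In the real bakery the “norms” came from expert-driven trend analysis and test batches, not a one-line statistical rule, but the alerting logic—compare each measurement against a fixed acceptable range—would look much like this.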
What I found interesting about this blog was that it explores the changes to the smart manufacturing industry with the introduction of big data tools. Chang explains that if Factora were to tackle the cake problem today, they would use an algorithm to optimize the cake-baking process. “Rather than a data expert developing a set of static test criteria from a data set, the predictions would be machine-generated, becoming ever more sophisticated over time, backed by ever more data.” Chang says the program Factora uses to generate algorithms—Analytics, by ThingWorx—can take years of data and create an algorithm that offers 70 percent accuracy. (That seems low to me, but it is apparently more than acceptable in the industry.)
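To make the shift from “static test criteria” to “machine-generated predictions” concrete, here is a toy example of the latter: a one-variable least-squares fit of cake rise against oven temperature, which can simply be retrained whenever new batch records arrive. This is not how ThingWorx Analytics works internally; the feature and numbers are invented to show the idea of a model backed by data rather than hand-set rules.

```python
# Toy "machine-generated prediction": fit cake rise to oven temperature
# with ordinary least squares, retraining as more batch data accumulates.

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Invented batch records: (oven temperature in C, cake rise in cm).
temps = [176, 178, 180, 182]
rises = [4.8, 5.0, 5.2, 5.4]

a, b = fit_line(temps, rises)
predict = lambda t: a * t + b
print(round(predict(181), 2))  # predicted rise at 181 C
```

Each new batch record just gets appended to `temps` and `rises` and the line is refit, which is the blog’s point in miniature: the prediction comes from the data, and more data refines it.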
Essentially, even baking can create a data set to be dumped into a program and used to produce an algorithm. I do, however, see a difference between these hypothetical baking algorithms and those labeled Weapons of Math Destruction by Cathy O’Neil. For one, these baking algorithms can receive feedback and be adjusted accordingly. It’s very easy to track which cake batches succeed or fail and compare them with the predictions of the algorithm. These baking algorithms are also transparent because bakeries need to know exactly what goes into a successful batch.
Honestly, I was pleasantly surprised by this blog post. Who knew there was a useful and non-harmful way to use big data to improve baking? That being said, I wonder if this only applies to corporate bakeries or if mom-and-pop shops could benefit as well.