<p dir="ltr">Introduction</p><p dir="ltr">For my Q-Step placement, I worked at the Centre for Food Policy, an interdisciplinary centre that aims to improve food policy and supply worldwide. The Centre pursues this by generating evidence that supports the development of an effective food system and by engaging with policy-makers (The Centre for Food Policy, 2025).</p><p></p><p dir="ltr">The task I was given was to analyse a dataset on food packaging provided by Valpak (2020) for the financial year 2017/18, covering 20 different food product categories. The dataset included the packaging material and type, the average packaging weight, the product category and its market tonnage.</p><p><br></p><p dir="ltr">Methodology</p><p dir="ltr">I initially made pivot tables in Excel to focus on specific types of food and packaging. Portion weights recorded as ranges, for example 100-200g, were converted to the value three quarters of the way up the range (175g). This was necessary because some products had portions recorded as, for example, 100g, 100-200g, 150g and 200g, and the range category "100-200g" could not be analysed otherwise; taking the three-quarter point seemed the most effective way to highlight differences. The average packaging weight was then divided by the weight of food contained to calculate the amount of packaging per gram of food, which identifies the pack sizes that carry the most packaging per gram. I repeated this for different food types and for specific packaging types and materials, and displayed the results on line graphs using Datawrapper (2013) for easier visualisation.</p><p><br></p><p dir="ltr">I then wanted to estimate the amount of packaging per gram of food for portion sizes that were not recorded in the original Valpak data. 
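</p><p dir="ltr">As a rough illustration, the two Methodology steps above (converting a range label to its three-quarter point, then dividing packaging weight by food weight) can be sketched in Python. The example rows are hypothetical values, not the actual Valpak figures:</p>

```python
# Minimal sketch of the two Methodology steps above, using made-up
# example rows rather than the actual Valpak figures.

def midpoint_3q(label: str) -> float:
    """Convert a portion label such as '100-200g' or '150g' to a single
    weight, taking ranges at three quarters of the way up (100-200 -> 175)."""
    label = label.rstrip("g")
    if "-" in label:
        low, high = (float(x) for x in label.split("-"))
        return low + 0.75 * (high - low)
    return float(label)

def packaging_per_gram(avg_pack_weight_g: float, portion_label: str) -> float:
    """Grams of packaging per gram of food contained."""
    return avg_pack_weight_g / midpoint_3q(portion_label)

# Hypothetical rows: (portion label, average packaging weight in grams)
rows = [("100-200g", 38.5), ("150g", 12.0), ("700g", 21.0)]
for label, pack in rows:
    print(label, round(packaging_per_gram(pack, label), 3))  # e.g. 100-200g -> 0.22
```

<p dir="ltr">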
To do this I needed a line of best fit, and the most suitable type was found by creating graphs in Excel until one met the criteria. The power trendline proved the best fit, since its R² was the closest to 1, as shown in Figure 3. Using formulas generated in Excel, I found the line of best fit for each graph, which allowed me to add it to each of the original graphs in Datawrapper for the final result.</p><p dir="ltr">Findings and Conclusion:</p><p dir="ltr">Figure 1: The meat products were grouped together in one graph because those in highest demand all come in plastic trays. Trays containing the smallest portions of beef carry more than a fifth of a gram of packaging per gram of food, an exorbitant amount compared with the other three products on the graph, and the figure decreases as the beef portions become larger. This suggests that the excess packaging is unnecessary, possibly used only for presentation, making the product look more appealing and more resistant to damage on the way home, and hence attracting more customers. The other three products, bacon, chicken and sausages, start relatively high compared with other products on the market, as the other graphs show, but they are consistent and share similar patterns, suggesting that this may be the norm in packaging weight for these products when sold in plastic trays. The only outlier among the three is bacon which, while following a similar pattern, peaks at 200, 275 and 700 grams, showing an abnormal use of packaging for those particular portions. 
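</p><p dir="ltr">The power-trendline step described in the Methodology can be sketched as follows: a minimal Python stand-in for Excel's power trendline, fitting y = a·x^b by least squares on log-transformed data and reporting R² on that transformed scale, which is how Excel scores a power fit. The data points are hypothetical, not taken from the Valpak dataset:</p>

```python
# Sketch of the trendline step from the Methodology: fitting a power law
# y = a * x**b by ordinary least squares in log-log space, as a stand-in
# for Excel's power trendline. The data points below are hypothetical.
import math

def fit_power(xs, ys):
    """Return (a, b, r2) for y = a * x**b, fitted in log-log space."""
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    b = sum((x - mx) * (y - my) for x, y in zip(lx, ly)) / sum((x - mx) ** 2 for x in lx)
    a = math.exp(my - b * mx)
    # R^2 on the log-transformed data, matching Excel's power-trendline R^2
    ss_res = sum((y - (math.log(a) + b * x)) ** 2 for x, y in zip(lx, ly))
    ss_tot = sum((y - my) ** 2 for y in ly)
    return a, b, 1 - ss_res / ss_tot

# Hypothetical packaging-per-gram values falling off with portion size
portions = [100, 200, 400, 700]
pack_per_g = [0.22, 0.12, 0.07, 0.04]
a, b, r2 = fit_power(portions, pack_per_g)
print(f"y = {a:.3f} * x^{b:.3f}, R2 = {r2:.3f}")
```

<p dir="ltr">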
</p><p dir="ltr">Figure 2: Across all cheese categories, including all types of packaging: except for cheese blocks, which generally use less packaging by the nature of their wrapping (while still following a similar trend to the other two), the cheese categories show a similar pattern. They are at their highest for the smallest portions, starting at around 0.065 and 0.07 grams of packaging per gram of food, then decrease, show a small peak between 200-250g, and fall again.</p><p dir="ltr">Challenges</p><p dir="ltr">When I tried to import the dataset into SPSS, a range of issues occurred that prevented me from continuing. Because of this, I had to use Excel instead, which was a challenge in itself, as I first had to learn the software and become proficient with it.</p>