Warp deformation is a common error encountered in additive manufacturing. It is typically caused by residual internal stresses in the manufactured part that arise as material cools. These errors are challenging to prevent or correct as they build over time and thus are only visible long after the actions that caused them. As a result, existing work in extrusion additive manufacturing has attempted warp detection but not correction or prevention. We report a hybrid approach combining deep learning, computer vision, and expert heuristics to correct or prevent warp. We train a deep convolutional neural network using diverse labelled images to recognise warp in real time. We compute five metrics from detection candidates to predict the severity of warp deformation and proportionately update print settings. This enables the first demonstration of automated warp detection and correction both during printing and for future prints.
In line with Matta's grand mission to build AI for manufacturing the impossible, the possible must first be manufactured well (and AI can help here too). We worked with the Institute for Manufacturing in the Department of Engineering at the University of Cambridge to explore how AI can help recognise and correct warp deformation errors in the 3D printing process, hopefully demonstrating the potential that machine learning can bring to traditional manufacturing and control problems which remain unsolved.
In a paper published in Additive Manufacturing, we report a low-cost and scalable method to augment any thermoplastic extrusion 3D printer with state-of-the-art object detection neural networks capable of detecting warp in real time. A single-stage deep convolutional neural network was trained to both detect and localise warp features with high accuracy in unseen images, and to provide a confidence level for its predictions. The network was trained on a labelled dataset of warping examples from a wide range of part geometries in a variety of colours, collected using a custom data collection pipeline. Unlike existing approaches, the method presented extracted a suite of metrics from detections in images to estimate the severity of warp deformation present. These metrics were then combined with expert-informed heuristics both to slow the growth of warp during printing, limiting its severity, and to reduce or even eliminate the occurrence of warp in future prints through the automated correction of printing and slicing parameters.
Materials & Methods
What is warp and how did we collect the data
Warp deformation typically occurs in AM due to residual thermal stresses generated by the non-uniform cooling rate after material deposition. With each newly deposited layer, these stresses increase and can lead to part distortion, especially when the adhesion between the part and build plate is insufficient to overcome the pulling force induced by the residual stresses. As such, warp is a continuous process building over time with every deposited layer. Normally, warping starts from the corners and results in the part peeling away from the build plate.
A network of 8 Creality CR-20 Pro FFF 3D printers was used for data collection. To collect images from the printer network, a custom pipeline was developed to create the first diverse warping dataset containing labelled images of warp features for a wide range of geometries printed in different materials and colours.
In total, 74 parts were printed, with 10,154 images collected, automatically timestamped, and stored on the server. Of these, 1,414 images contained warping features for the model to learn from. Bounding box labels were drawn on the images to encompass both the warp defect and the shadow/reflection underneath. In total, the final labelled dataset contained 1,976 bounding box samples across the 1,414 images which contained warp.
AI model for locating warp and estimating severity
Object detection was used to detect and localise instances of warp in captured images of the printing process. The single-stage detection network, YOLOv3, was chosen due to its combined fast detection speed and high accuracy. We applied multiple commonly used data augmentation methods to the bounding box labelled images in the dataset to reduce overfitting. We trained the network (pre-trained on COCO) on our dataset and achieved a mAP of 88.72%.
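As an illustration, a minimal sketch of a bounding-box-aware augmentation pipeline is shown below. The exact transforms and library are not specified in the paper; the use of albumentations and the particular transforms here are illustrative assumptions.

```python
import numpy as np
import albumentations as A

# Dummy 720p frame and one YOLO-format box (x_center, y_center, w, h, normalised)
image = np.zeros((720, 1280, 3), dtype=np.uint8)
bboxes = [(0.5, 0.8, 0.2, 0.1)]
class_labels = ["warp"]

# Commonly used augmentations that keep the bounding boxes consistent
augment = A.Compose(
    [
        A.HorizontalFlip(p=0.5),
        A.RandomBrightnessContrast(p=0.3),
        A.HueSaturationValue(p=0.3),
    ],
    bbox_params=A.BboxParams(format="yolo", label_fields=["class_labels"]),
)

out = augment(image=image, bboxes=bboxes, class_labels=class_labels)
augmented_image, augmented_boxes = out["image"], out["bboxes"]
```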
To successfully correct warping defects in 3D printed parts it is important to not only know when warp is present during a print, but to also know its severity. As such, methods were developed to estimate the severity of warp both during a print and upon its completion. Bed temperature is known to strongly correlate with the prevalence and scale of warp, with lower temperatures resulting in greater warping. Therefore, multiple bed temperature levels were used to produce parts with different scales of warp in a controlled manner to determine suitable metrics. This warp severity calibration is a one-time process that can be applied to future unseen prints.
A simple rectangular cuboid geometry was designed and sliced for printing at different bed temperatures. In total, 20 samples of this geometry were printed out of grey ABS, 5 for each of the following bed temperatures: 70°C, 80°C, 90°C, and 100°C. Approximately 60 images were acquired throughout the course of each print. Five distinct metrics were created to extract greater information from the images fed through our trained detection model. These metrics are: the area of the predicted bounding boxes, the number of detections, the confidence of the predictions, the estimated volume of the warped region within the predicted bounding box (volume of pixels under the mean brightness of the box), and the aspect ratio (width over height) of the predicted bounding box. The above metrics were calculated for all the images collected during the 20 cuboid prints, plus the same prints with a horizontal flip applied to all images (providing 40 examples in total, 10 for each bed temperature), to determine whether these metrics can differentiate between multiple levels of warping severity.
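To make the metric definitions concrete, below is a minimal sketch of how the five metrics could be computed for a single image. The precise implementation (e.g. whether per-box values are averaged or summed, and the exact weighting in the warp-volume estimate) is an illustrative assumption rather than the code used in the paper.

```python
import numpy as np

def warp_metrics(gray_image, detections):
    """Compute the five per-image warp metrics from detection candidates.

    gray_image: 2-D numpy array (grayscale frame).
    detections: list of (x1, y1, x2, y2, confidence) in pixel coordinates.
    """
    if not detections:
        return {"area": 0.0, "count": 0, "confidence": 0.0,
                "volume": 0.0, "aspect_ratio": 0.0}

    areas, confs, volumes, ratios = [], [], [], []
    for x1, y1, x2, y2, conf in detections:
        w, h = x2 - x1, y2 - y1
        areas.append(w * h)
        confs.append(conf)
        ratios.append(w / h if h > 0 else 0.0)
        box = gray_image[int(y1):int(y2), int(x1):int(x2)]
        # "Warp volume": pixels darker than the box's mean brightness,
        # weighted by how far below the mean they fall (one interpretation
        # of "volume of pixels under the mean brightness of the box").
        below = np.clip(box.mean() - box, 0, None)
        volumes.append(float(below.sum()))

    return {
        "area": float(np.mean(areas)),
        "count": len(detections),
        "confidence": float(np.mean(confs)),
        "volume": float(np.mean(volumes)),
        "aspect_ratio": float(np.mean(ratios)),
    }
```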
The cumulative sum of these metrics was tracked throughout the duration of the prints, with the mean and standard deviation of each bed temperature shown in Fig. 1B. The sums diverge as the print progresses, demonstrating the effectiveness of each metric. Furthermore, the 90°C and 100°C samples, which are barely distinguishable in Fig. 1A, are also similar in our metrics. However, the cumulative sum is affected by the length of the print and number of images. This can be resolved by using the cumulative mean, dividing each metric at a point in time by the number of images received, thus making the metric values robust to different print lengths (see Fig. 1C). With the cumulative mean, much like the sum, the levels of warp severity seen in Fig. 1A are clearly distinguishable using the developed metrics, with 70°C and 80°C having higher metric values corresponding to greater warp and 90°C and 100°C following similar paths.
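A minimal sketch of the running cumulative mean that makes the metrics robust to print length (an illustrative helper, not the exact code used):

```python
class RunningMean:
    """Cumulative mean of a metric, updated as each new image arrives."""

    def __init__(self):
        self.total = 0.0
        self.count = 0

    def update(self, value):
        self.total += value
        self.count += 1
        return self.total / self.count

# Usage: one tracker per metric, updated with each image's metric value
area_mean = RunningMean()
latest = area_mean.update(12_500.0)  # e.g. bounding box area for this image
```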

Results
Autonomous warp reduction and correction
The trained object detection models and developed metrics were applied via two different methods. The first is capable of reducing further warping during a print through intelligent parameter intervention once a set threshold of warp is detected. The second method can dramatically reduce and often eliminate warp from future prints by adjusting a wider range of both printing and slicing parameters.
A real-time automated feedback pipeline was developed to detect warp and analyse its severity during printing; this opened the door for in situ mitigation of further warp during the remainder of the print. This online detection and correction utilised a workstation equipped with two Nvidia Quadro RTX 5000 GPUs, an i9-9900K CPU (8 cores and 16 threads), and 64GB of RAM. Only a single GPU is required for the correction.
Snapshots are captured every 30s at a resolution of 1280x720 pixels during the course of printing using an off-the-shelf USB camera. These images are sent over the network to a server where they are reshaped and converted to tensors for inference with our selected YOLOv3 model. This model subsequently proposes a list of warp candidates in an image, providing a bounding box for localisation along with a confidence estimate. For each detection of warp in the image, the aforementioned metrics (bounding box area, number of detections, confidence, warp volume, and aspect ratio) are calculated and stored in separate lists. A running cumulative mean of these metrics is determined from these lists and updated after receiving each new image. The latest value of the cumulative mean (this being the overall mean up to that point in the print) is continuously compared to set thresholds for each metric. These thresholds are the maximum value of the cumulative mean for each metric in the 90°C and 100°C bed temperature examples, which show minimal warp. The thresholds are shown as dashed lines in Fig. 2 (step 3). If any of the metrics exceeds its threshold value, a command is sent to the printer to update printing parameters. As warp is caused by the build-up of internal residual stresses during cooling after material deposition, slowing down the rate and amount of cooling reduces warp. Thus, by increasing the bed temperature, turning off part cooling fans, and slowing down the printing speed, it is possible to save otherwise severely warping prints. Specifically, in the samples shown in Fig. 2B, the bed temperature was increased to the printer's max value of 100°C, the fan was turned off, and the print speed was reduced to 75%. It was found that successfully reducing warp during printing, after its initial formation, required significant interventions, especially as heating the bed takes time. These interventions were determined through experimentation.
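A minimal sketch of the threshold check that triggers an intervention is shown below. The threshold numbers are placeholders; in practice each threshold is the maximum cumulative mean observed for that metric in the 90°C and 100°C calibration prints, as described above.

```python
# Placeholder thresholds; real values come from the minimal-warp
# (90 °C and 100 °C) calibration prints.
THRESHOLDS = {
    "area": 1.0e4,        # px^2 (placeholder)
    "count": 0.5,         # detections per image (placeholder)
    "confidence": 0.2,    # placeholder
    "volume": 5.0e5,      # placeholder
    "aspect_ratio": 1.5,  # placeholder
}

def intervention_needed(cumulative_means):
    """True if any metric's cumulative mean exceeds its calibrated threshold."""
    return any(cumulative_means.get(name, 0.0) > limit
               for name, limit in THRESHOLDS.items())
```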
Once it has been determined that an intervention is beneficial using the metric thresholds, commands are sent over the network to the Raspberry Pi connected to the printer. The Pi subsequently generates the correct G-code instructions and sends them via serial interface to the printer’s control board. Examples of the results of this automated intervention process can be seen in Fig. 2. Here all 4 samples were printed using the exact same model and settings. It is clear that warping is significantly reduced for the 2 samples where the intervention was applied.
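The exact G-code issued is not listed in this summary, but the described intervention maps onto standard Marlin commands. A minimal sketch of how the Raspberry Pi could send them over serial follows; the port, baud rate, and use of pyserial are assumptions.

```python
import serial

# Assumed serial connection parameters; the summary only states that the
# Raspberry Pi sends G-code to the printer's control board over serial.
printer = serial.Serial("/dev/ttyUSB0", baudrate=115200, timeout=2)

def send_gcode(line):
    printer.write((line + "\n").encode("ascii"))

# Standard Marlin commands matching the described intervention
send_gcode("M140 S100")  # raise bed temperature to 100 °C
send_gcode("M107")       # turn off the part-cooling fan
send_gcode("M220 S75")   # reduce the feedrate (print speed) to 75 %
```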

Although this method performs well, it cannot eliminate warp once it has occurred, only reduce further warping. As such, another automated feedback pipeline was developed to eliminate warp from future prints.
Much like the system described above, images of the printing process are captured every 30s and sent over the network to a server for inference. However, instead of updating settings during the print, this method applies updates to future prints upon the analysis of images from an entire print. Fig. 3 presents a schematic for the system as a whole. Every image during a print is analysed by the trained YOLOv3 model to detect instances of warp. Using the candidates proposed by the model, the same five metrics (bounding box area, number of detections, confidence, warp volume, and aspect ratio) are calculated. Subsequently, the mean of these metrics is taken across the entire print, with a value of zero used when no detections are present. These are then normalised using the data obtained from the calibration cuboid prints discussed previously. The normalised metric values are averaged using the mean to create a combined metric, which is in turn used to categorise the warp into one of 3 severity levels: minor, moderate, and major. Although warping, the relevant corrective parameters, and traditional control processes operate in the continuous domain, a discretised approach to correction was taken due to the discrete data obtained from the 4 categories of calibration prints and the unknown continuous relationships between parameters and warping. Additionally, this binning approach is more robust to variation and noise in both the detection of warp and the estimation of its severity. Depending on the severity level determined for the print, a suite of updates is applied to the slicing and printing parameters for the next print to reduce the chance of warping.
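A minimal sketch of this end-of-print severity categorisation is shown below. The severity cut-offs and normalisation constants here are illustrative placeholders; the real values derive from the calibration cuboid prints.

```python
import numpy as np

def classify_severity(per_image_metrics, calibration_max):
    """End-of-print warp categorisation (sketch).

    per_image_metrics: list of metric dicts, one per image (zeros when
    no warp was detected in that image).
    calibration_max: per-metric normalisation constants taken from the
    calibration cuboid prints.
    """
    names = ["area", "count", "confidence", "volume", "aspect_ratio"]
    # Mean of each metric over the whole print
    means = {n: float(np.mean([m[n] for m in per_image_metrics])) for n in names}
    # Normalise by the calibration data and average into one combined metric
    combined = float(np.mean([means[n] / calibration_max[n] for n in names]))

    # Severity bins; the cut-off values are illustrative placeholders
    if combined < 0.33:
        return "minor"
    elif combined < 0.66:
        return "moderate"
    return "major"
```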
For minor warp, the cooling fan is turned off and the bed temperature is increased by 10%. More interventions are applied for the moderate warp level, with the cooling fan turned off, bed temperature increased by 20%, infill density reduced by 20%, and the Z offset lowered by 0.04mm. Finally, when major warp is predicted, a wide range of updates is applied, with the cooling fan turned off, bed temperature increased by 30%, infill density reduced by 40%, the Z offset lowered by 0.08mm, a brim of 5 lines added, and the hotend temperature for the first layer increased by 5°C. These parameters and amounts were selected by expert operators as important factors in the development of warp deformation and tuned through experimentation. Set limits, based on the printer's specification, mitigate harmful overcorrection by checking whether the newly corrected temperatures and Z offset are achievable, with values past the boundary clamped.
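A minimal sketch of the parameter-update and clamping step follows. The adjustment amounts are those listed above; the setting names and printer limits are assumptions for illustration.

```python
# Assumed printer limits used for clamping
BED_TEMP_MAX = 100.0   # °C (printer-specific, assumed)
Z_OFFSET_MIN = -0.2    # mm (safety limit, assumed)

UPDATES = {
    "minor":    {"bed_temp_scale": 1.10},
    "moderate": {"bed_temp_scale": 1.20, "infill_scale": 0.80,
                 "z_offset_delta": -0.04},
    "major":    {"bed_temp_scale": 1.30, "infill_scale": 0.60,
                 "z_offset_delta": -0.08, "brim_lines": 5,
                 "first_layer_hotend_delta": 5.0},
}

def apply_updates(settings, severity):
    """Return corrected slicing/printing settings for the next print."""
    u = UPDATES[severity]
    corrected = dict(settings)
    corrected["fan_on"] = False  # cooling fan turned off at every severity level
    corrected["bed_temp"] = min(settings["bed_temp"] * u["bed_temp_scale"],
                                BED_TEMP_MAX)  # clamp to printer maximum
    if "infill_scale" in u:
        corrected["infill_density"] = settings["infill_density"] * u["infill_scale"]
    if "z_offset_delta" in u:
        corrected["z_offset"] = max(settings["z_offset"] + u["z_offset_delta"],
                                    Z_OFFSET_MIN)  # clamp to safe range
    if "brim_lines" in u:
        corrected["brim_line_count"] = u["brim_lines"]
    if "first_layer_hotend_delta" in u:
        corrected["first_layer_hotend_temp"] = (
            settings["first_layer_hotend_temp"] + u["first_layer_hotend_delta"])
    return corrected
```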
Multiple experiments were conducted for a range of geometries to demonstrate the efficacy of the warp correction system. Two different models of surgical guides and a tensile test specimen were printed using ABS with sensible print and slicing settings. All of the parts printed with these initial settings resulted in various levels of warp, as seen in Fig. 3. Each part was subsequently reprinted with settings automatically updated by the presented system. Both surgical guides show a dramatic warp reduction, resulting in functional devices, and the tensile test specimen shows no signs of deformation. In Fig. 3, two ABS prints of gears are presented where severe warping led to detachment from the print bed and complete failure. These gears were printed using the same settings as the previous uncorrected samples but with an 80°C bed temperature. The developed methodology correctly detected warp and automatically updated parameters, enabling the reprinted gears to be manufactured with no errors. These five experiments demonstrate that the system can generalise to completely unseen geometries in different colours. Due to the diverse training dataset and relative correction approach, the authors believe the system can even be applied to unseen materials and printers.

Conclusion
In this work, automated detection and correction of warp deformation errors is demonstrated, comprising a sophisticated methodology for detecting warp, estimating its severity, and subsequently updating appropriate slicing and printing parameters to reduce the occurrence of warp in future prints. This process is also demonstrated to dramatically slow and reduce the severity of warp in situ during a print, enabling otherwise unusable parts to still be functional. This system is not only useful for reducing discard rate and wasted material, energy, and time, but also for optimising certain parameters to their maximum without resulting in warp. For example, requiring a high infill density for a mechanically strong part tends to result in warp; with this system, users can autonomously find the maximum infill density before warp occurs.
Further Reading
- Read the paper
Authors
Douglas A. J. Brion
Sebastian W. Pattinson
Acknowledgements
This work was funded by the Engineering and Physical Sciences Research Council, UK PhD Studentship EP/N509620/1 to D.A.J.B., Royal Society award RGS/R2/192433 to S.W.P., Academy of Medical Sciences award SBF005/1014 to S.W.P., Engineering and Physical Sciences Research Council award EP/V062123/1 to S.W.P., and an Isaac Newton Trust award to S.W.P.