A Modular Deep Learning Pipeline (CNN+U-Net+GAN) for Color-Accurate, Cross-Material Digital Textile Printing with Transfer-Learning-Based Material Adaptation

Hua Yan, Huizhi Jiang

Abstract


Precise color reproduction and efficient pattern generation are the core goals of digital printing on clothing. To overcome the limitations of traditional processes that rely on manual parameter tuning and trial-and-error sampling on fabric, this paper proposes an intelligent printing generation framework based on deep learning. The framework integrates CNN-based color management, deep segmentation with loop optimization, GAN-driven 3D virtual rendering, and transfer-learning-based material adaptation, achieving end-to-end pattern generation and computational optimization across multi-material data such as cotton, silk, and polyester. The system not only captures the spatial detail features of patterns (such as edge sharpness and color gradation) but also maintains color consistency and detail restoration across materials through cross-domain modeling. Experimental results show that on multi-material datasets the scheme achieves ΔE = 1.9 ± 0.2 across cotton/silk/polyester (mean over 3 runs), corresponding to a 30–45% reduction versus screen printing (ΔE ≈ 4.1) and 15–25% versus a commercial inkjet baseline (ΔE ≈ 2.3). It reduces the splicing fracture rate to below 4%, shortens average processing time by ~60% (12 h → 4.8–8.5 h depending on batch size), and raises SSIM to 0.93 ± 0.01. All statistics are mean ± std over three independent runs; significance is assessed with paired t-tests or ANOVA with Bonferroni correction at α = 0.05. This research verifies the effectiveness of deep learning in digital printing and provides an extensible intelligent path for integrating the clothing design and production chain, supporting the fashion industry's transformation toward personalization, sustainability, and intelligence.
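To make the reported color metric concrete: ΔE is a distance between a target color and the printed color in CIELAB space, aggregated over runs as mean ± std. The sketch below uses the simple CIE76 formula (Euclidean distance in Lab); the paper may well use a more perceptually uniform variant such as CIEDE2000, and the Lab measurements shown are purely illustrative placeholders, not data from the study.

```python
import math
from statistics import mean, stdev

def delta_e_cie76(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two CIELAB triples."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical target color and printed-color measurements from three runs
target = (52.0, 18.0, -24.0)
printed_runs = [
    (53.1, 17.2, -23.0),
    (51.0, 19.1, -25.2),
    (52.8, 18.9, -22.7),
]

# Per-run color error, then the mean ± std statistic the abstract reports
errors = [delta_e_cie76(target, p) for p in printed_runs]
print(f"dE = {mean(errors):.2f} +/- {stdev(errors):.2f} over {len(errors)} runs")
```

In practice the per-run ΔE would itself be averaged over many measured patches per material before the cross-run statistics are computed.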



References


Casciani D, Chkanikova O, Pal R. Exploring the nature of digital transformation in the fashion industry: opportunities for supply chains, business models, and sustainability-oriented innovations [J]. Sustainability: Science, Practice and Policy, 2022, 18(1): 773–795. https://doi.org/10.1080/15487733.2022.2125640.

Gill S. Evolving pattern practice, from traditional to digital parameterisation for customised apparel [J]. International Journal of Clothing Science and Technology, 2024, 36(2): 150–165. https://doi.org/10.1080/17543266.2023.2260829.

Glogar M. Digital technologies in the sustainable design and production of fashion [J]. Sustainability, 2025, 17(4): 1371. https://doi.org/10.3390/su17041371.

Butturi M A. Pre-Consumer Waste Recovery and Circular Patterns [J]. Environments, 2025, 12(3): 82. https://doi.org/10.3390/environments12030082.

Li S. Review on development and application of 4D printing technologies in smart textiles [J]. Smart Materials and Structures, 2023, 32(11): 113001. https://doi.org/10.1088/1361-665X/acefbb.

Gazzola P, Grechi D, Iliashenko I, Pezzetti R. The evolution of digitainability in the fashion industry: a bibliometric analysis [J]. Kybernetes, 2024, 53(13): 101–126. https://doi.org/10.1108/K-05-2024-1385.

Gao C, Xu F, Liu Y. Study on the quality and inkjet printing effect of the prepared washing-free disperse dye ink [J]. RSC Advances, 2023, 13(15): 9782–9790. https://doi.org/10.1039/D3RA01597A.

Choi K H. 3D dynamic fashion design development using digital virtual simulation systems [J]. Fashion and Textiles, 2022, 9(1): 23. https://doi.org/10.1186/s40691-021-00286-1.

Tkalec M. The complexity of colour/textile interaction in digital printed textiles [J]. Arts, 2024, 13(1): 29. https://doi.org/10.3390/arts13010029.

Walker E B. Color accuracy and durability for printed, branded textiles: a comparison across sublimation, DTG, and screen printing [J]. Journal of Imaging Science and Technology, 2024, 68(1): 18–27. https://doi.org/10.2352/CIC.2024.32.1.18.

Baek E. Defining digital fashion: Reshaping the field via a systematic literature review [J]. Fashion and Textiles, 2022, 9(1): 1–23. https://doi.org/10.1016/j.chb.2022.107407.

Sayem A S M. Digital fashion innovations for the real world and metaverse [J]. International Journal of Clothing Science and Technology, 2022, 34(2): 150–165. https://doi.org/10.1080/17543266.2022.2071139.

Duan Y. Exploring the law of color presentation of double-sided heterochromatic digital printing [J]. Frontiers in Psychology, 2022, 13: 956748. https://doi.org/10.3389/fpsyg.2022.956748.

Habib A, Ullah A, Maha MM, et al. Advancing sustainable fashion through 3D virtual design for reduced environmental impact [J]. Journal of Textile Engineering & Fashion Technology, 2025, 11(3): 135–142. https://doi.org/10.15406/jteft.2025.11.00415.

Pietroni N, Dumery C, Guenot-Falque R, Liu M, Vidal-Calleja T, Sorkine-Hornung O. Computational pattern making from 3D garment models [J]. Computer-Aided Design, 2022, 144: 103028. https://doi.org/10.1016/j.cad.2021.103028.

Li Y, Zhang J, Xu H. Study of color reproduction in pigment digital printing [J]. Textile Research Journal, 2023, 93(11–12): 2179–2193. https://doi.org/10.1177/00405175221147725.

Zhu W, Huang Y, Shen L. A method of enhancing silk digital printing color prediction based on Pix2Pix framework [J]. Applied Sciences, 2023, 14(1): 11. https://doi.org/10.3390/app14010011.

Wu X. An application of generative deep learning models in textile design [J]. Textile Design Journal, 2024, 15(2): 115–128. https://doi.org/10.1080/14606925.2024.2303236.

Glogar M, Naglić M, Petrak S. Sustainable pre-treatment of cellulose knitwear in digital pigment printing processes [J]. International Journal of Clothing Science and Technology, 2024, 37(4): 679–693. https://doi.org/10.1108/IJCST-03-2024-0061.

Zhao H, Li Y, Wang C. Insights into coloration enhancement of mercerized cotton fabric on reactive dye digital inkjet printing [J]. RSC Advances, 2022, 12(17): 10386–10395. https://doi.org/10.1039/D2RA01053D.





DOI: https://doi.org/10.31449/inf.v49i14.11235

This work is licensed under a Creative Commons Attribution 3.0 License.