DEEP LEARNING – POWERED GRAYSCALE TO COLOR CONVERSION USING OPEN CV
| Author(s) | Y. Bhanu Prakash Reddy, Y. V. N. Prem Kumar, U. Sri Hari, K. Saranya |
|---|---|
| Country | India |
| Abstract | Colorization of grayscale images is a challenging and fascinating problem in computer vision. This project leverages deep learning techniques, specifically convolutional neural networks (CNNs), to automatically convert grayscale images into colorized versions. Using OpenCV together with pre-trained deep learning models such as DeOldify or the Colorful Image Colorization model, we achieve high-quality image colorization with minimal user intervention. The model learns color distributions from large datasets and predicts realistic colors for grayscale images. This approach finds applications in restoring old photographs, enhancing medical imaging, and improving visual aesthetics in various fields. A minimal OpenCV sketch of this pipeline is shown after the table. |
| Keywords | Grayscale-to-Color Image Conversion, Image Colorization, Machine Learning |
| Field | Computer > Artificial Intelligence / Simulation / Virtual Reality |
| Published In | Volume 16, Issue 1, January-March 2025 |
| Published On | 2025-03-22 |
| Cite This | DEEP LEARNING – POWERED GRAYSCALE TO COLOR CONVERSION USING OPEN CV - Y. Bhanu Prakash Reddy, Y. V. N. Prem Kumar, U. Sri Hari, K. Saranya - IJSAT Volume 16, Issue 1, January-March 2025. DOI 10.71097/IJSAT.v16.i1.2674 |
| DOI | https://doi.org/10.71097/IJSAT.v16.i1.2674 |
| Short DOI | https://doi.org/g892fm |
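
The abstract references OpenCV's DNN module and the pre-trained Colorful Image Colorization model (Zhang et al., 2016). The following is a minimal sketch of that pipeline, assuming the publicly released Caffe prototxt, weights, and cluster-centre file (`colorization_deploy_v2.prototxt`, `colorization_release_v2.caffemodel`, `pts_in_hull.npy`) are available locally; the input and output file names are illustrative placeholders rather than part of the published work.

```python
import cv2
import numpy as np

# Assumed local paths to the publicly released Colorful Image Colorization
# model files (Zhang et al., 2016); names are illustrative, not from the paper.
PROTOTXT = "colorization_deploy_v2.prototxt"
MODEL = "colorization_release_v2.caffemodel"
POINTS = "pts_in_hull.npy"

# Load the Caffe network with OpenCV's DNN module and the 313 ab cluster centres.
net = cv2.dnn.readNetFromCaffe(PROTOTXT, MODEL)
pts = np.load(POINTS)

# Install the cluster centres as 1x1 convolution kernels in the output layers.
class8 = net.getLayerId("class8_ab")
conv8 = net.getLayerId("conv8_313_rh")
pts = pts.transpose().reshape(2, 313, 1, 1)
net.getLayer(class8).blobs = [pts.astype("float32")]
net.getLayer(conv8).blobs = [np.full([1, 313], 2.606, dtype="float32")]

# Read the grayscale input (stored as a 3-channel image), scale to [0, 1],
# and convert to the Lab colour space.
image = cv2.imread("input_gray.jpg")          # placeholder file name
scaled = image.astype("float32") / 255.0
lab = cv2.cvtColor(scaled, cv2.COLOR_BGR2LAB)

# The network expects a mean-centred 224x224 L (lightness) channel as input.
resized = cv2.resize(lab, (224, 224))
L = cv2.split(resized)[0]
L -= 50

# Predict the a and b channels and resize them back to the original resolution.
net.setInput(cv2.dnn.blobFromImage(L))
ab = net.forward()[0].transpose((1, 2, 0))
ab = cv2.resize(ab, (image.shape[1], image.shape[0]))

# Recombine the predicted ab channels with the original L channel,
# convert back to BGR, and save the colorized result.
L_orig = cv2.split(lab)[0]
colorized = np.concatenate((L_orig[:, :, np.newaxis], ab), axis=2)
colorized = cv2.cvtColor(colorized, cv2.COLOR_LAB2BGR)
colorized = (255 * np.clip(colorized, 0, 1)).astype("uint8")
cv2.imwrite("output_color.jpg", colorized)    # placeholder file name
```

The key design choice in this sketch is to work in the Lab colour space: the network predicts only the a and b chrominance channels, while the original L channel of the grayscale input is kept, so the detail of the source image is preserved in the colorized output.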