Abstract:
With the shift toward data-centric AI, it is indispensable to provide properly labeled data to our DL models. We have built a prototype data-handling pipeline that first gives a high-level insight into a pool of unlabeled data, then classifies images as clean or noisy with respect to a particular type of noise. The noisy images are segregated, denoised using deep learning models, and then merged back with the clean images. The type of noise is chosen by the user according to the dataset. The pipeline is developed with modularity in mind and can be scaled to other types of noise as well.