We demonstrate how quantum machine learning might play a vital role in achieving moderate speedups on machine learning problems, and might offer richer models of the distribution underlying the observed data. We use Restricted Boltzmann Machines (RBMs) to demonstrate this on supervised learning tasks. We compare the performance of contrastive divergence training against sampling from a D-Wave annealer, first on the bars-and-stripes dataset and then on an imbalanced network-security dataset. Finally, we train using Quantum Imaginary Time Evolution, which is well suited to the Noisy Intermediate-Scale Quantum (NISQ) era, to perform classification on the MNIST dataset.
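For context, the classical baseline referred to above is contrastive divergence (CD-k) training of a binary RBM; a D-Wave annealer can replace the Gibbs-sampling negative phase with hardware samples. The sketch below is a minimal, generic CD-1 implementation in NumPy, not the authors' code; the layer sizes, learning rate, and toy batches are assumptions for illustration.

```python
# Minimal sketch of CD-1 training for a binary RBM (generic illustration;
# sizes, learning rate, and toy data are assumed, not from the paper).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_visible, n_hidden, lr = 16, 8, 0.05            # assumed toy dimensions
W = 0.01 * rng.standard_normal((n_visible, n_hidden))
b = np.zeros(n_visible)                           # visible biases
c = np.zeros(n_hidden)                            # hidden biases

def cd1_step(v0):
    """One CD-1 gradient estimate from a batch of binary visible vectors."""
    # Positive phase: hidden activations driven by the data.
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # Negative phase: one Gibbs step down to visibles and back up.
    # (This is the step a quantum annealer's samples would stand in for.)
    pv1 = sigmoid(h0 @ W.T + b)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)
    # Gradients: data statistics minus model statistics.
    dW = (v0.T @ ph0 - v1.T @ ph1) / len(v0)
    return dW, (v0 - v1).mean(axis=0), (ph0 - ph1).mean(axis=0)

# Toy training loop on random binary batches (placeholder for a real dataset
# such as bars-and-stripes).
for epoch in range(100):
    batch = (rng.random((32, n_visible)) < 0.5).astype(float)
    dW, db, dc = cd1_step(batch)
    W += lr * dW
    b += lr * db
    c += lr * dc
```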