When creating a 3D scene, the artistic style determines the atmosphere that the environment conveys to the viewer, and the style of the scene is largely shaped by the styles of the models it contains. The texture of a 3D model plays an important role in forming the model's overall artistic style, so transferring the style of a texture is one way to modify the atmosphere of the scene that contains the model. Recent image style transfer algorithms based on neural networks are capable of transferring the artistic style of one image to another, but their designs do not take the properties of 3D model textures into account. This leads to artifacts that degrade the visual quality of the model when the stylized texture is applied back to it. Therefore, this thesis project focuses on designing a neural network architecture dedicated to transferring the artistic style of 3D model textures while eliminating these artifacts.