<p dir="ltr">This thesis first presents the damped block Newton (dBN) method for solving diffusion and diffusion-reaction problems in one dimension. dBN is an iterative solver for the shallow Ritz approximation to the solution of the partial differential equation (PDE). It minimizes the Ritz functional by alternating between the linear and non-linear parameters of a shallow ReLU neural network (NN): the linear parameters are updated by exact inversion of the mass and stiffness matrices, while the non-linear parameters are updated by one modified Newton iteration applied to a reduced system, all at a cost of O(n), where n is the number of neurons in the NN. A hallmark of the method is the use of geometric information to reduce the system, which ensures invertibility of the Hessian within the Newton iteration. Numerical experiments show that dBN performs well on problems where finite element methods (FEMs) struggle, such as interface problems and singularly perturbed reaction-diffusion problems. The efficiency of the dBN method is further improved by introducing adaptivity (adaptive dBN, or AdBN).</p><p dir="ltr">The geometric meaning of the linear parameters continues to be fruitful in applications to error indicators. The use of the linear parameter as an error indicator, denoted the C-indicator, is the second focus of this thesis. In particular, it is shown theoretically that, for Ritz formulations of PDEs using shallow ReLU NNs, the linear parameter corresponds to the edge jump at the current break point (in 1D) or break line (in 2D). The linear parameter is therefore proposed as a cheap error indicator for these applications. Numerical experiments show that the C-indicator is comparable in accuracy to the ZZ-indicator within adaptive neuron enhancement (ANE) methods for selected one- and two-dimensional problems.</p>
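<p dir="ltr">The block-alternating structure described above can be illustrated with a minimal sketch. This is not the dBN method itself: it assumes the model problem -u'' = f on (0,1) with zero Dirichlet data, uses trapezoid quadrature and a boundary penalty rather than exact O(n) mass/stiffness inversion, and replaces the modified Newton step on the reduced system with a damped finite-difference gradient step on the breakpoints. All function names and parameters are illustrative.</p>

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def energy_and_coeffs(b, xq, wq, f, gamma=1e4):
    """Inner (linear) step: for fixed breakpoints b, solve exactly for the
    optimal linear coefficients c and return the Ritz energy and c."""
    # Shallow ReLU basis phi_i(x) = ReLU(x - b_i); its derivative is a step.
    Phi  = relu(xq[None, :] - b[:, None])             # (n, q) basis values
    dPhi = (xq[None, :] > b[:, None]).astype(float)   # (n, q) derivatives
    fq = f(xq)
    A = (dPhi * wq) @ dPhi.T                          # stiffness matrix
    F = (Phi * wq) @ fq                               # load vector
    phi1 = relu(1.0 - b)                              # basis values at x = 1
    A_pen = A + gamma * np.outer(phi1, phi1)          # penalize u(1) != 0
    c = np.linalg.lstsq(A_pen, F, rcond=None)[0]
    u1 = phi1 @ c
    J = 0.5 * c @ A @ c - F @ c + 0.5 * gamma * u1**2  # penalized Ritz energy
    return J, c

def block_alternating_sketch(n=8, iters=30, eta=1e-3):
    """Alternate exact linear solves with damped steps on the non-linear
    parameters (the breakpoints) -- a stand-in for dBN's modified Newton."""
    xq = np.linspace(0.0, 1.0, 2001)                  # quadrature nodes
    wq = np.full_like(xq, xq[1] - xq[0])
    wq[[0, -1]] *= 0.5                                # trapezoid weights
    f = lambda x: np.pi**2 * np.sin(np.pi * x)        # exact u = sin(pi x)
    b = np.linspace(0.05, 0.95, n)                    # initial breakpoints
    for _ in range(iters):
        J, c = energy_and_coeffs(b, xq, wq, f)
        # Damped finite-difference gradient step on the breakpoints.
        g = np.zeros(n)
        h = 1e-5
        for i in range(n):
            bp = b.copy()
            bp[i] += h
            g[i] = (energy_and_coeffs(bp, xq, wq, f)[0] - J) / h
        b = np.clip(b - eta * g, 0.01, 0.99)          # keep breakpoints inside
    J, c = energy_and_coeffs(b, xq, wq, f)
    u = relu(xq[None, :] - b[:, None]).T @ c          # NN approximation on grid
    return np.max(np.abs(u - np.sin(np.pi * xq)))     # max error vs exact u

err = block_alternating_sketch()
print(err)
```

<p dir="ltr">The inner least-squares solve mirrors the exact update of the linear parameters (here via a penalty system rather than the mass/stiffness factorization of the thesis), while the outer loop moves the breakpoints, the geometric degrees of freedom that dBN updates by a modified Newton iteration on a reduced system.</p>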