Virtual reality cloud-based gaming (VRCG) services, delivered over computer networks, are becoming widely available on virtual reality (VR) devices. VRCG brings users worldwide an extensive catalog of games to play anywhere and anytime. Delivering these gaming services over existing broadband mobile networks is challenging due to their stochastic nature and the sensitivity of the user-perceived Quality of Experience (QoE) to network conditions. More research is needed on developing effective methods to measure the impact of network QoS factors on users' QoE in the VRCG context. Therefore, this paper proposes, develops, and validates three novel regression models trained on a real dataset collected via subjective tests (N=30); the dataset contains subjective users' QoE ratings for VR shooters affected by network conditions (N=28) such as round-trip time (RTT), random jitter (RJ), and packet loss (PL). Our findings reveal that, due to the nonlinear relationship of RTT and RJ tested together, the nonlinear (mean absolute error (MAE)=0.14) and polynomial (MAE=0.15) regression models perform best; yet, a simple linear regression model (MAE=0.19) is also suitable for predicting QoE for VRCG. Further, we found that the models' most important feature is RTT, followed by the combined (RTT, RJ) feature. Finally, our models' predictions of QoE for real-world traffic measurements suggest that mobile network traffic (4G, 5G non-standalone, 5G standalone) provides a \(2.5 \leq MOS_{QoE} \leq 3.0\) experience for VRCG, whereas wired connections achieve \(4.2 \leq MOS_{QoE} \leq 4.4\), suggesting the need for improvements in current commercial 5G network deployments to deliver VRCG.
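To make the modeling approach described above concrete, the following minimal sketch (not the authors' implementation) shows how linear and degree-2 polynomial regression models could be fitted to map the three network QoS features (RTT, RJ, PL) to MOS-style QoE ratings and evaluated with MAE; the scikit-learn pipeline, the synthetic data, and all variable names are illustrative assumptions.

```python
# Minimal sketch: linear vs. polynomial regression of MOS-style QoE on
# network QoS features (RTT, RJ, PL). Data and coefficients are synthetic
# placeholders, not the paper's dataset.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)

# Hypothetical feature matrix: RTT (ms), random jitter (ms), packet loss (%).
X = np.column_stack([
    rng.uniform(10, 300, 200),   # RTT
    rng.uniform(0, 100, 200),    # RJ
    rng.uniform(0, 5, 200),      # PL
])
# Hypothetical MOS ratings on a 1-5 scale used as placeholder ground truth.
y = np.clip(
    5 - 0.01 * X[:, 0] - 0.005 * X[:, 1] - 0.2 * X[:, 2]
    + rng.normal(0, 0.2, 200),
    1, 5,
)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "linear": LinearRegression(),
    # Degree-2 expansion adds squared terms and the RTT*RJ interaction,
    # which is where a nonlinear RTT/RJ relationship would be captured.
    "polynomial": make_pipeline(PolynomialFeatures(degree=2), LinearRegression()),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    mae = mean_absolute_error(y_test, model.predict(X_test))
    print(f"{name} regression MAE: {mae:.2f}")
```

In this setup, comparing the held-out MAE of the two pipelines mirrors the paper's comparison of linear, polynomial, and nonlinear models, with the polynomial pipeline standing in for models that capture the joint effect of RTT and RJ.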