Real user monitoring (RUM) is a passive monitoring technology that records all interaction between users and a website, or between a client and a server or cloud-based application.[1] Monitoring actual user interaction with a website or an application is important to operators for determining whether users are being served quickly and without errors and, if not, which part of a business process is failing.[2] Software-as-a-service (SaaS) and application service providers (ASPs) use RUM to monitor and manage the quality of service delivered to their clients. Real user monitoring data is used to determine the actual service-level quality delivered to end users and to detect errors or slowdowns on websites.[3] The data may also be used to determine whether changes propagated to sites have the intended effect or instead cause errors.

Organizations typically use RUM to test changes within the production environment or to anticipate behavioral changes in a website or application by using A/B testing or other techniques. As technology shifts toward hybrid environments such as cloud services, fat clients, widgets, and apps, it becomes increasingly important to monitor application usage from within the client itself.

Real user monitoring is typically "passive monitoring", i.e., the RUM device collects web traffic without having any effect on the operation of the site. In most cases, a JavaScript snippet is injected into the page, or native code is embedded within the application, to provide feedback from the browser or client. This data is collected from many individual users and consolidated.[4]
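Concretely, the injected browser code typically reads timing data that the browser itself records and beacons it to a collection endpoint. The following is a minimal sketch of that idea using the standard Navigation Timing API; the /rum-collect endpoint and the particular metrics sampled are illustrative assumptions, not any specific vendor's agent.

```typescript
// Minimal sketch of the browser-side half of a RUM agent. A real product
// would inject this as a <script> tag; /rum-collect is a hypothetical
// collection endpoint chosen for illustration.
window.addEventListener("load", () => {
  // The browser records navigation timings on its own; the agent only reads them.
  const [nav] = performance.getEntriesByType(
    "navigation"
  ) as PerformanceNavigationTiming[];
  if (!nav) return;

  const sample = {
    page: location.pathname,
    // Time to first byte: how long the server took to begin responding.
    ttfb: nav.responseStart - nav.startTime,
    // DOM-ready and full page-load times as experienced by this user.
    domReady: nav.domContentLoadedEventEnd - nav.startTime,
    loadTime: nav.loadEventEnd - nav.startTime,
  };

  // sendBeacon queues the report asynchronously, so measurement does not
  // delay the page -- the "passive" property described above.
  navigator.sendBeacon("/rum-collect", JSON.stringify(sample));
});
```

Because the report is queued asynchronously after the page has loaded, the measurement itself adds no perceptible overhead, which is what makes this style of collection passive.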

RUM can be very helpful in identifying and troubleshooting last-mile issues. RUM differs from synthetic monitoring in that it relies on actual people interacting with the page to take measurements, rather than on automated tests that simply step through a predefined set of test steps.

References

  1. Altvater, Alexandra (2020-01-29). "What Is Real User Monitoring? How It Works, Examples, Best Practices, and More". Stackify. Retrieved 2022-08-13.
  2. "Real user monitoring (RUM)". Dynatrace. Retrieved September 20, 2021.
  3. "USER EXPERIENCE MONITORING". UTP. 12 June 2014. Retrieved 5 November 2014.
  4. Oyama, Katsunori; Takeuchi, Atsushi; Ming, Hua; Chang, Carl K. (December 2011). "A Concept Lattice for Recognition of User Problems in Real User Monitoring". 2011 18th Asia-Pacific Software Engineering Conference. Ho Chi Minh, Vietnam: IEEE. pp. 163–170. doi:10.1109/APSEC.2011.32. ISBN 978-1-4577-2199-1. S2CID 7779708.