As far as I'm aware, FullStory makes an initial call home with details about the page being recorded, which it uses to crawl your website and build up a cached copy of your website/application.
From then on, all it does is record everything happening on the page, anything from scrolling to clicking a button. These events are bundled up and periodically sent home for processing.
When it comes to playing back a session, it simply replays each event step by step inside that cached copy it created on the initial step.
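As a rough illustration (not FullStory's actual code), the playback step could be as simple as re-applying timestamped events in order against the cached copy of the page. The `replay` helper and event shape here are my own invention:

```javascript
// Hypothetical playback sketch: each recorded event carries a
// timestamp `t`, and the player re-applies them in order via an
// `apply` callback (which, in a real player, would scroll the cached
// page, animate a click, and so on).
function replay(events, apply) {
  for (const event of [...events].sort((a, b) => a.t - b.t)) {
    apply(event);
  }
}

const log = [];
replay(
  [{ t: 200, type: "click" }, { t: 100, type: "scroll" }],
  e => log.push(e.type)
);
// log is now ["scroll", "click"] - events replayed in timestamp order
```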
As for recording other things like the console, it just overrides the globals with a custom function that forwards to the real one, so both the custom and the native behaviour run.
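The console override pattern might look something like this, keeping a reference to the native function so nothing on the page breaks (the `capturedLogs` queue is a stand-in for whatever FullStory actually does with the data):

```javascript
// Hypothetical sketch: wrap console.log so calls are captured for the
// session recording, then forwarded to the original implementation.
const originalLog = console.log;
const capturedLogs = [];

console.log = function (...args) {
  capturedLogs.push(args);                  // record the call
  return originalLog.apply(console, args);  // keep native behaviour
};

console.log("hello", 42);
// capturedLogs now contains [["hello", 42]] and "hello 42" was
// still printed by the native console.log
```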
Thanks for the detailed answer. Can you elaborate on how, in your opinion, the "record" and "crawl" parts would look when implemented?
Do they convert global assets to local ones and store them themselves? Because assets might change over time.
Yes, they appear to store cached versions of your assets so a replay stays true to the version the user was seeing during the session.
As for the crawler, I imagine it makes a request to your website and pulls out all known assets (HTML, CSS, JavaScript and images), and perhaps follows links in order to collect assets from all of the pages across the website/application.
It appears to know the difference between local and CDN assets, and it doesn't cache CDN links or media files like audio and video.
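A minimal sketch of that asset-extraction step, assuming the crawler works on raw HTML: pull out `src`/`href` URLs, keep only same-origin ones, and skip media files. The function name and the regex-based extraction are my own simplification; a real crawler would parse the DOM properly:

```javascript
// Hypothetical sketch: extract asset URLs from an HTML string,
// keeping same-origin assets and dropping CDN links and media files.
function extractLocalAssets(html, origin) {
  const urls = [...html.matchAll(/(?:src|href)="([^"]+)"/g)].map(m => m[1]);
  return urls.filter(url => {
    const resolved = new URL(url, origin);        // resolve relative URLs
    const sameOrigin = resolved.origin === origin;
    const isMedia = /\.(mp3|mp4|webm|ogg)$/i.test(resolved.pathname);
    return sameOrigin && !isMedia;
  });
}

const html =
  '<link href="/app.css">' +
  '<script src="https://cdn.example.com/lib.js"></script>' +
  '<video src="/intro.mp4"></video>';
const assets = extractLocalAssets(html, "https://example.com");
// assets is ["/app.css"] - the CDN script and the video are skipped
```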
Recording works via listeners: all the supported events are initialized on page load, and each time one fires it is added to a queue that is subsequently sent back to FullStory.
When a bundle is sent appears to depend on the size of the queue and when it was last sent; the algorithm also supports exponential back-off and will bail if it is unable to deliver the payload.
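The queue-and-flush behaviour described above could be sketched like this. The class, thresholds, and retry counts are all assumptions for illustration, and a real implementation would actually wait out the back-off delay asynchronously rather than just computing it:

```javascript
// Hypothetical sketch: events accumulate in a queue and are flushed
// once the queue passes a size threshold; a failed send doubles the
// retry delay (exponential back-off) and eventually bails.
class EventQueue {
  constructor(send, { maxSize = 3, maxRetries = 5 } = {}) {
    this.send = send;          // transport, e.g. fetch or sendBeacon
    this.maxSize = maxSize;
    this.maxRetries = maxRetries;
    this.queue = [];
  }

  push(event) {
    this.queue.push(event);
    if (this.queue.length >= this.maxSize) this.flush();
  }

  flush() {
    const bundle = this.queue.splice(0); // drain the queue
    let delay = 1000;
    for (let attempt = 0; attempt < this.maxRetries; attempt++) {
      if (this.send(bundle)) return true; // delivered
      delay *= 2; // back-off (a real version would await this delay)
    }
    return false; // bail: give up on this payload
  }
}

const sent = [];
const q = new EventQueue(bundle => { sent.push(bundle); return true; });
q.push({ type: "click" });
q.push({ type: "scroll" });
q.push({ type: "click" }); // third event crosses maxSize and flushes
// sent now holds one bundle of three events, and q.queue is empty
```

A time-based trigger (flush after N seconds since the last send) would sit alongside the size check, typically via `setInterval` or a timer reset on each flush.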