Each time we re-analyzed a page, we got more data and more URLs for the crawler to grab and have waiting for us by the next time we analyzed that particular page. Of course, session-dependent content would get badly messed up, but that generally doesn't make for useful (or at least repeatable) search results anyway.
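To make the loop concrete, here's a toy sketch (entirely hypothetical names and link graph, not anything resembling the real pipeline): each analysis pass over the already-fetched pages discovers URLs that get handed to the crawler, so the next pass over the same page has more of its resources available.

```python
# Hypothetical illustration of the analyze -> queue -> crawl -> re-analyze loop.
# LINKS maps a fetched resource to the URLs its analysis would discover.
LINKS = {
    "/": ["/app.js"],
    "/app.js": ["/data.json"],
    "/data.json": [],
}

def index_site(seed, links, max_rounds=10):
    fetched = {seed}
    for _ in range(max_rounds):
        to_crawl = set()
        # Re-analyze every fetched page; collect newly referenced URLs.
        for page in fetched:
            for url in links.get(page, []):
                if url not in fetched:
                    to_crawl.add(url)
        if not to_crawl:
            break            # analysis converged; nothing new to grab
        fetched |= to_crawl  # crawler fetches these before the next pass
    return fetched

print(sorted(index_site("/", LINKS)))
```

Each round only surfaces links one hop deeper, which is why re-analyzing the same page repeatedly gradually fills in the picture rather than getting it all at once.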
Source: I worked primarily on JavaScript execution for Google's indexing pipeline from 2006 to 2010.