Build off-line versions of event and session tables
GENERAL
TESTING
Description
Currently these spool off into space in perpetuity - most folks clean them out from time to time to keep the tables from getting too big. What we need is a way to capture and store this data efficiently for report generation and querying.
It might be nice to come up with a way to automate a process where these get moved from the "live" tables to the offline tables every 30 minutes or so. The structure of the offline tables could actually be slightly different from the online tables and tuned to make queries run really fast. Ultimately the queries for the SiteStats tool could union between the (now much smaller) online tables and the offline tables to get the data, so the users would never see the "30 minute lag" - but things would perform nicely.
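A minimal sketch of the "move then union" idea, using SQLite for illustration. The table and column names (event_live, event_offline, event_date) are hypothetical, not Sakai's actual schema, and the single-transaction move is one reasonable way to avoid losing or double-counting rows:

```python
import sqlite3
from datetime import datetime, timedelta

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE event_live (event_id INTEGER, event_date TEXT, event TEXT)")
conn.execute("CREATE TABLE event_offline (event_id INTEGER, event_date TEXT, event TEXT)")

now = datetime(2010, 8, 28, 12, 0, 0)
conn.executemany("INSERT INTO event_live VALUES (?, ?, ?)", [
    (1, (now - timedelta(hours=2)).isoformat(), "site.visit"),      # old: gets archived
    (2, (now - timedelta(minutes=5)).isoformat(), "content.read"),  # recent: stays live
])

def archive_older_than(conn, cutoff):
    """Move rows older than cutoff from the live table to the offline table."""
    with conn:  # one transaction, so a row is never lost or duplicated mid-move
        conn.execute(
            "INSERT INTO event_offline SELECT * FROM event_live WHERE event_date < ?",
            (cutoff,))
        conn.execute("DELETE FROM event_live WHERE event_date < ?", (cutoff,))

# The scheduled job (every 30 minutes or so) would do:
archive_older_than(conn, (now - timedelta(minutes=30)).isoformat())

# Reports union the now-small live table with the offline table,
# so users never notice the archival lag.
all_events = conn.execute(
    "SELECT event_id FROM event_live UNION SELECT event_id FROM event_offline"
).fetchall()
```

In a real deployment the offline table could also carry extra indexes or a denormalized layout tuned for reporting, since it is only ever written by the archival job.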
Then, over time, we could take the offline tables and store them by semester or something like that.
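The "store by semester" idea could be as simple as routing archived rows to per-term tables by date. A hedged sketch - the table-name prefix and the semester boundaries (Jan-May spring, Jun-Aug summer, Sep-Dec fall) are assumptions, not anything decided in this issue:

```python
from datetime import date

def semester_table(event_date: date, prefix: str = "event_offline") -> str:
    """Return the per-semester archive table name for an event's date.

    Boundaries are illustrative: months 1-5 -> spring, 6-8 -> summer,
    9-12 -> fall.
    """
    if event_date.month <= 5:
        term = "spring"
    elif event_date.month <= 8:
        term = "summer"
    else:
        term = "fall"
    return f"{prefix}_{event_date.year}_{term}"
```

The archival job would then insert into `semester_table(row_date)` instead of a single offline table, which keeps each archive table bounded in size and makes it easy to drop or export a whole term at once.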
CLE team bulk inactive issues cleanup - if this is important then reopen or create a new issue
Aaron Zeckoski August 28, 2010 at 1:07 PM
There was a bit of discussion about this, and the basic outcome is that there is no agreement on the right way to handle this stuff. Since everyone does agree we don't have time to implement multiple solutions, I would propose we close this one out, as we are planning to handle this with documentation and some warnings generated by the code. If it stagnates until the next MT issue pass (i.e. if you are on the MT and doing a JIRA sweep and this is the last comment), then go ahead and close it out.
David Horwitz February 23, 2010 at 6:32 AM
MAINT TEAM REVIEW: This feature request is currently unassigned and will be reviewed. In line with stated Jira practice (http://confluence.sakaiproject.org/display/MGT/Sakai+Jira+Guidelines), feature requests that are not going to be implemented will be closed with a status of "Won't Fix". If you intend to implement this issue, please ensure that it's up to date and assigned correctly.