I'm reworking an existing PHP/MySQL/JS/Ajax web app that processes a LARGE number of table rows for users. Here's how the page works currently.
- A user uploads a LARGE csv file. The test file I'm working with has 400,000 rows (each row has 5 columns).
- PHP creates a brand-new table for this data and inserts the hundreds of thousands of rows (a sketch of this step follows the list).
- The page then sorts/processes/displays this data back to the user in a useful way. Processing includes searching, sorting by date and other columns, and re-displaying the rows without a huge load time (that's where the JS/Ajax comes in).
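For context, here's a minimal sketch of the per-upload approach as I understand it, assuming the load is done with `LOAD DATA LOCAL INFILE` (the table name, column names, credentials, and file path are all placeholders I made up):

```php
<?php
// Sketch of the per-upload flow: create a fresh table, then bulk-load
// the CSV. LOAD DATA is far faster than looping over 400,000
// single-row INSERTs. All names and paths here are placeholders.
$pdo = new PDO(
    'mysql:host=localhost;dbname=csv_app;charset=utf8mb4',
    'db_user',
    'db_pass',
    [PDO::MYSQL_ATTR_LOCAL_INFILE => true] // needed for LOCAL INFILE
);

// One table per upload, named so it can't collide with another upload.
$table = 'upload_' . bin2hex(random_bytes(8));
$pdo->exec("CREATE TABLE `$table` (
    id   INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    col1 VARCHAR(255),
    col2 VARCHAR(255),
    col3 VARCHAR(255),
    col4 VARCHAR(255),
    col5 DATE,
    KEY idx_date (col5)  -- supports the sort-by-date views
) ENGINE=InnoDB");

// Bulk-load the uploaded file in one statement.
$path = $pdo->quote('/tmp/upload.csv');
$pdo->exec("LOAD DATA LOCAL INFILE $path INTO TABLE `$table`
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
    LINES TERMINATED BY '\\n'
    (col1, col2, col3, col4, col5)");
```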
My question is: should this app be placing the data into a new table for each upload, or into one large table with an ID for each file? I think the original developer was adding separate tables for speed. Speed is very important for this.
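The single-table alternative I'm weighing would look something like this (again just a sketch; `csv_rows`, `upload_id`, and the column names are placeholders). Every query would filter on `upload_id` first, so the composite index matters:

```php
<?php
// Sketch of the single-table alternative: all uploads share one table,
// keyed by upload_id. Table and column names are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=csv_app;charset=utf8mb4', 'db_user', 'db_pass');

$pdo->exec("CREATE TABLE IF NOT EXISTS csv_rows (
    id        BIGINT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    upload_id INT UNSIGNED NOT NULL,
    col1 VARCHAR(255),
    col2 VARCHAR(255),
    col3 VARCHAR(255),
    col4 VARCHAR(255),
    col5 DATE,
    KEY idx_upload_date (upload_id, col5)  -- every query filters by upload first
) ENGINE=InnoDB");

// Reads are always scoped to one upload, so the composite index
// keeps the per-page queries fast even as the table grows.
$stmt = $pdo->prepare(
    'SELECT col1, col2, col3, col4, col5
       FROM csv_rows
      WHERE upload_id = ?
      ORDER BY col5
      LIMIT 50'
);
$stmt->execute([42]); // 42 = some upload's id
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
```

The trade-off as I see it: one shared table means simpler code and no DDL per upload, but removing a finished upload becomes a large DELETE instead of an effectively instant DROP TABLE.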
Is there a faster way? Is there a better mousetrap? Has anyone dealt with this?
Remember, every .csv can contain hundreds of thousands of rows, and hundreds of .csv files can be uploaded daily. They can be deleted about 24 hours after they were last used (I'm thinking a cron job; any opinions? A sketch of what I have in mind is below).
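If the table-per-upload approach stays, the cron cleanup I'm imagining is something like the script below. It assumes a hypothetical `uploads` bookkeeping table that records each upload's table name and a `last_used_at` timestamp the app touches on every access:

```php
<?php
// cleanup.php -- sketch of the cron job. Assumes a hypothetical
// `uploads` bookkeeping table storing each upload's table name and
// a last_used_at timestamp that the app updates on every access.
$pdo = new PDO('mysql:host=localhost;dbname=csv_app;charset=utf8mb4', 'db_user', 'db_pass');

$stale = $pdo->query(
    "SELECT id, table_name FROM uploads
      WHERE last_used_at < NOW() - INTERVAL 24 HOUR"
)->fetchAll(PDO::FETCH_ASSOC);

foreach ($stale as $upload) {
    // DROP TABLE frees the space instantly; no 400,000-row DELETE needed.
    $name = str_replace('`', '', $upload['table_name']); // crude identifier sanitizing
    $pdo->exec("DROP TABLE IF EXISTS `$name`");

    $done = $pdo->prepare('DELETE FROM uploads WHERE id = ?');
    $done->execute([$upload['id']]);
}
```

Run hourly from crontab, e.g. `0 * * * * php /path/to/cleanup.php` (path is a placeholder).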
Thank you all!
A few notes based on comments:
- All data is unique to each user and changes, so the user won't be re-accessing it after a couple of hours. Only if they accidentally close the window and then come right back would they revisit the same .csv.
- No foreign keys required; all .csv files are private to each user and don't need to be cross-referenced.