In SS3.0.x I'm trying to batch delete from a large dataset: 16K records linked to Files and other DataObjects, probably about 250K rows all told once you count the relations. To decide what to delete I basically have a CSV of 10,000 entries that I need to compare against.
I figured the easiest way is to loop through my DataObject (the 16K table) and, if a record isn't in the CSV, delete it along with all of its related records. Right now I have PHP memory set to 512MB and the process timeout at 10 minutes, and it's still choking even when I batch-check 500 records at a time.
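For reference, here's a simplified sketch of my current approach, run as a BuildTask. "MyRecord", its "Code" field, the "Files" relation, and the CSV path are placeholders for my real classes and data:

```php
<?php
class CleanupTask extends BuildTask {

    public function run($request) {
        // Build a fast lookup table from the CSV of codes to KEEP,
        // so each membership check is O(1) instead of scanning the CSV.
        $keep = array();
        $fh = fopen(BASE_PATH . '/keep-list.csv', 'r');
        while (($row = fgetcsv($fh)) !== false) {
            $keep[$row[0]] = true; // first column holds the code
        }
        fclose($fh);

        $batchSize = 500;
        $offset = 0;
        do {
            // Stable sort so the offset stays meaningful as rows are deleted
            $records = DataObject::get('MyRecord')
                ->sort('ID')
                ->limit($batchSize, $offset);
            $count = $records->count();

            foreach ($records as $record) {
                if (isset($keep[$record->Code])) {
                    // Kept records stay in the table, so step past them
                    $offset++;
                    continue;
                }
                // Delete related files and other children first
                foreach ($record->Files() as $file) {
                    $file->delete();
                }
                $record->delete();
            }
        } while ($count == $batchSize);
    }
}
```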
I'm wondering whether I should just temporarily max out my PHP environment and push through it, or whether there's a way to optimize the process itself so it gets through more efficiently.