Yesterday we had usage data loaded into a custom object twice, which has inflated our usage numbers. I read the article indicating I can delete data from Data Management if I apply a filter that shows only the data I need to delete, but I'm running into an issue: I can filter to the DATE but not to the exact TIME of the duplicate entries.
Has anyone been able to work around this?
Best answer by jean.nairon
For the record, you’re not a true admin until you duplicate some data. Lol. We’ve all done it and definitely been there.
That is correct, you can’t filter to time in Data Management or Data Operations. You can only filter to the date. This works well if you have one record created on 7/9 and another created on 7/10, but it doesn’t work if you have one on 7/9 at 9:00 am and another on 7/9 at 10:00 am.
In Data Operations, you can sort by the date/time field though. If you don’t have too many records to delete, it might be quicker to sort and just select a page of records at a time.
If you have a lot of records to delete, I have also used rules in the past to update a boolean or text field on the record, and then used that field as the filter to delete by in Data Operations. In the rule query, you won’t be able to filter for the time either, but you can filter for the date and show only the Max of Created Date. Your rule would look something like this:
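In case it helps to see the logic behind that rule, here is a minimal Python sketch of the same idea: group records on a natural key, then flag the rows whose Created Date equals the Max of Created Date within the group (i.e., the second load). The field names (`account_id`, `usage_date`, `created`) are hypothetical stand-ins for whatever your custom object actually uses.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical usage records: a natural key (account_id + usage_date)
# plus the system Created Date. Both loads landed on the same day,
# an hour apart -- so date alone can't separate them.
records = [
    {"id": 1, "account_id": "A1", "usage_date": "2024-07-09",
     "created": datetime(2024, 7, 9, 9, 0)},
    {"id": 2, "account_id": "A1", "usage_date": "2024-07-09",
     "created": datetime(2024, 7, 9, 10, 0)},  # duplicate load, 10:00 am
]

# Group by the natural key.
groups = defaultdict(list)
for r in records:
    groups[(r["account_id"], r["usage_date"])].append(r)

# Within each duplicated group, flag the rows matching the
# Max of Created Date -- those came from the second load.
to_delete = []
for key, rows in groups.items():
    if len(rows) < 2:
        continue  # only one copy, nothing to flag
    latest = max(row["created"] for row in rows)
    to_delete.extend(row["id"] for row in rows if row["created"] == latest)

print(to_delete)  # ids of the later-created duplicates
```

In the actual rule, the flagged rows would get the boolean/text field set, and Data Operations would then filter on that field to delete them.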
That can add a bit of time, though, to build the rule and ensure you’re only deleting the bad data. If you have just two copies of the same data set loaded, it may be quicker to delete all the records and load the set again.