Solved

Deleting Data from Data Management

  • 1 July 2020
  • 3 replies
  • 112 views


Yesterday we had usage data loaded into a custom object twice which has caused inflated usage. I read the article that indicates I can delete data from Data Management if I apply a filter that only shows the data I need to delete but I’m running into an issue where I can filter to the DATE but not to the exact TIME of the duplicate data entries. 
 

Has anyone been able to work around this? 


Best answer by jean.nairon 11 July 2020, 00:07

For the record, you’re not a true admin until you duplicate some data. Lol. We’ve all done it and definitely been there. 

That is correct: you can’t filter by time in Data Management or Data Operations, only by date. That works well if you have one record created on 7/9 and another created on 7/10, but not if one was created on 7/9 at 9:00am and the other on 7/9 at 10:00am.

In Data Operations, you can sort by the date/time field though. If you don’t have too many records to delete, it might be quicker to sort and just select a page of records at a time.

 

If you have a lot of records to delete, I have also used rules in the past to update a boolean or text field on the record, and then used that field as the filter in Data Operations. In the rule query you won’t be able to filter by time either, but you can filter by date and take the Max of Created Date. Your rule would look something like this:
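The rule itself is configured in the Gainsight UI, but the logic it implements can be sketched in plain Python. This is only an illustration under assumptions: the record and field names below are made up, and the real rule would use a dataset task with a MAX aggregation on Created Date.

```python
from collections import defaultdict

# Hypothetical duplicated usage records: the same usage data was loaded
# twice, so each (account, usage date) pair exists with two different
# Created (load) timestamps.
records = [
    {"account": "Acme", "usage_date": "2020-07-09", "created": "2020-07-09T09:00"},
    {"account": "Acme", "usage_date": "2020-07-09", "created": "2020-07-09T10:00"},
    {"account": "Beta", "usage_date": "2020-07-09", "created": "2020-07-09T09:00"},
    {"account": "Beta", "usage_date": "2020-07-09", "created": "2020-07-09T10:00"},
]

# Step 1 (the rule query): find the MAX Created timestamp per key.
# That identifies the second, duplicate load.
latest_load = defaultdict(str)
for r in records:
    key = (r["account"], r["usage_date"])
    latest_load[key] = max(latest_load[key], r["created"])

# Step 2 (the rule action): set a flag field on records from the later
# load. Data Operations can then filter on this flag and delete.
for r in records:
    r["delete_flag"] = r["created"] == latest_load[(r["account"], r["usage_date"])]

flagged = [r for r in records if r["delete_flag"]]
```

Note this assumes the second load is the bad one; if the duplicates could land with identical timestamps, you’d need another distinguishing field instead.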

 

That can add a bit of time, though, to build the rule and ensure you’re only deleting the bad data. If you have just two copies of the same data set loaded, it may be quicker to delete all the records and load the data again.

 


3 replies


@Robert_DeLaO I’ve done this myself (and I bet most Gainsight Admins have).

I don’t have an easy solution, but I was able to fall back on some other field/value that was also helpful in identifying duplicates. For example, if a different User happened to create the duplicate usage data, you can add that as a 2nd filter, then delete.
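To illustrate that second-filter idea with a sketch (field names and user names here are hypothetical): if the duplicate load was run by a different user, date plus created-by is enough to isolate the bad rows without needing the time.

```python
# Hypothetical records: the duplicate load was run by a different user,
# so "created by" plus the load date together identify the bad rows.
records = [
    {"account": "Acme", "usage_date": "2020-07-09", "created_by": "integration"},
    {"account": "Acme", "usage_date": "2020-07-09", "created_by": "jsmith"},
    {"account": "Beta", "usage_date": "2020-07-09", "created_by": "integration"},
    {"account": "Beta", "usage_date": "2020-07-09", "created_by": "jsmith"},
]

# The two filters you'd stack in Data Management: the date of the bad
# load, plus the user who ran it.
to_delete = [
    r for r in records
    if r["usage_date"] == "2020-07-09" and r["created_by"] == "jsmith"
]
```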

When that route isn’t available, I’ve had to create a new field on the Object in question, use the Rules Engine to populate that field for any duplicates (for example, writing a 1 into the new field after querying for the MAX value of the DATE or CREATED DATE field on your Usage records, assuming the most recently added data is the duplicate), then go back to Data Management, filter on the new field, and delete. It’s a two-step, but it might get you there.

 


Thanks. I’ll give this a try. :grinning:
