Auto Compare Data being loaded to an Object
I don't normally post ideas, but I think this one would greatly benefit everyone, especially GS.
I have been doing a lot of work in IBM's environment optimizing their rules. One way we optimize them is to fetch the old data and the new data, then merge and compare the two. If any values don't align, we push the record to upsert.
It would save a lot of time if this could be done automatically.
This would essentially only be available when loading to a GS object or SFDC object with an update or upsert action. When we build that action, there would be a checkbox. When checked, the rule would automatically, on the back end, fetch the fields being pushed to and compare the data points based on the mappings in the action. The merge of those datasets would be based on the identifiers marked. If any of the fields differ, the record would be pushed for updating.
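To make the proposed behavior concrete, here is a minimal sketch of the fetch-merge-compare step, in plain Python rather than the Rules Engine itself. The function name `changed_records` and the field names in the sample data are hypothetical, chosen just for illustration; the real implementation would use the action's identifier and field mappings.

```python
def changed_records(old_rows, new_rows, identifiers, mapped_fields):
    """Return only the records that are new or have at least one changed field."""
    # Index the existing (target-object) data by its identifier tuple - the merge step.
    old_by_key = {tuple(r[i] for i in identifiers): r for r in old_rows}
    to_upsert = []
    for new in new_rows:
        key = tuple(new[i] for i in identifiers)
        old = old_by_key.get(key)
        # Keep the record if it has no match, or any mapped field differs.
        if old is None or any(new.get(f) != old.get(f) for f in mapped_fields):
            to_upsert.append(new)
    return to_upsert

# Hypothetical sample data: one unchanged record, one genuinely new record.
old = [{"AccountId": "001", "Health": "Green", "ARR": 100}]
new = [
    {"AccountId": "001", "Health": "Green", "ARR": 100},
    {"AccountId": "002", "Health": "Red", "ARR": 50},
]

print(changed_records(old, new, ["AccountId"], ["Health", "ARR"]))
# Only the "002" record is pushed; the unchanged "001" record is dropped.
```

The unchanged record never reaches the upsert, which is exactly where the run-time savings come from.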
Not only does this save people/GS Admins from having to build this part out, it would also cut rule run times across all customers, since unchanged records would no longer need to be actioned on, streamlining the load process. That matters especially because building this out manually is not a concept many people grasp, and it takes extra effort when they can just push everything.
Feel free to reach out if you have any questions, need further details, or an example of how this is done manually.
We didn't even have a last-modified field to filter on, since the data is legacy. But even with one, they typically update all of their records anyhow.
As you can see in the screenshot below, this took 20-30 minutes, some days more.
We then changed the rule to have this design.
As you can see in the screenshot below, the rule now takes only 5-10 minutes, cutting roughly two-thirds off our rule run time.
And the comparison is done in a transformation task.