Data Management - NULL Constraint/Duplicate Records

Related products: None

Based on the current data issues, here are some proposals to introduce during data ingestion.

NULL values: There is no check constraint on NULL values during mapping. Can we introduce a NOT NULL constraint to reject such records?
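
As a rough illustration of the kind of check being asked for (the field names and the reject handling below are assumptions for the example, not the product's actual behaviour), a minimal sketch in Python:

    # Hypothetical NOT NULL check applied while ingesting a batch of records.
    REQUIRED_FIELDS = ["customer_id", "email"]   # assumed NOT NULL fields

    def split_on_nulls(records):
        """Separate records that violate the NOT NULL rule from clean ones."""
        accepted, rejected = [], []
        for rec in records:
            if any(rec.get(field) is None for field in REQUIRED_FIELDS):
                rejected.append(rec)   # reported back to the user instead of silently loaded
            else:
                accepted.append(rec)
        return accepted, rejected

    accepted, rejected = split_on_nulls([
        {"customer_id": "C1", "email": "a@example.com"},
        {"customer_id": None, "email": "b@example.com"},   # rejected: NULL in a required field
    ])
    print(len(accepted), "record accepted,", len(rejected), "record rejected")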

Duplicate records on key fields: Can we reject records with duplicate values in the key fields?
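
For illustration only (the key fields and the "first record wins" choice are assumptions, not the product's behaviour), duplicate rejection at ingestion time could look something like this sketch:

    # Hypothetical duplicate check on key fields within an ingested batch.
    KEY_FIELDS = ("account_id", "order_id")   # assumed key fields

    def reject_duplicates(records):
        """Keep the first record per key; reject any later record with the same key."""
        seen, accepted, rejected = set(), [], []
        for rec in records:
            key = tuple(rec.get(field) for field in KEY_FIELDS)
            if key in seen:
                rejected.append(rec)   # duplicate key -> reject rather than overwrite
            else:
                seen.add(key)
                accepted.append(rec)
        return accepted, rejected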

The current handling of duplicate records during the ingestion process is quite uncertain, so the record the customer expects may not be the one that shows up.

Execution Logs: The post-load log shows only the number of successful/failed records. Is it possible to show the number of records that were actually inserted or updated, instead of just the successful count?
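
To illustrate the kind of summary being asked for (the outcome labels below are assumptions for the example, not the actual log format), a small sketch:

    # Hypothetical post-load summary that separates inserted, updated and failed counts
    # instead of reporting a single "successful" number.
    from collections import Counter

    def summarize_load(outcomes):
        """outcomes: one label per record, e.g. 'inserted', 'updated' or 'failed'."""
        counts = Counter(outcomes)
        return {
            "inserted": counts.get("inserted", 0),
            "updated": counts.get("updated", 0),
            "failed": counts.get("failed", 0),
        }

    print(summarize_load(["inserted", "updated", "inserted", "failed"]))
    # -> {'inserted': 2, 'updated': 1, 'failed': 1}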

Introducing these features will help customers proactively identify data issues post load and make decisions before the data gets corrupted.



Hi Roja,

Thanks for your input!

NOT NULL constraint: This is being planned for the spring release, wherein a user can define a NOT NULL constraint at the field level, which will restrict the ingestion of null values into that field.

Unique Keys: The unique keys feature is also being planned for the spring release, wherein a user can define a field or a combination of fields as a unique key, which will restrict the user from ingesting duplicate records.
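
For illustration only, a field-level definition with a combination of fields as the unique key might be shaped roughly like the sketch below (this configuration layout is an assumption, not the planned product syntax):

    # Hypothetical object definition with field-level NOT NULL flags and a composite unique key.
    order_schema = {
        "object": "Order",
        "fields": {
            "order_id": {"not_null": True},
            "line_no":  {"not_null": True},
            "amount":   {"not_null": False},
        },
        "unique_key": ["order_id", "line_no"],   # combination of fields acting as the unique key
    }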

Execution logs: We will look into the feasibility of this.

Thanks and Regards,


Lakshmi

Thanks, Lakshmi, for the updates and the plan for these items in the product!