Hello Vinoth,
That is a good point!
So far there is no major change to the certification process in SDP.
As you mentioned, a load is still processed as a whole, with all of its records entering each step together.
The recommendation also still stands: do not handle the same entities in different queues, to avoid blockers and data discrepancies.
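To make the recommendation concrete, here is a minimal conceptual sketch (not Semarchy internals; the loads and record IDs are invented for illustration) of why a single FIFO queue per entity keeps the outcome deterministic, whereas two queues racing on the same record could publish in either order:

```python
from collections import deque

def run_queue(loads):
    """Serialize loads for one entity: each load waits for the previous one."""
    golden = {}
    queue = deque(loads)
    while queue:
        load = queue.popleft()       # one load at a time, FIFO order
        for rec_id, value in load:
            golden[rec_id] = value   # the later load wins, predictably
    return golden

# Two loads touching the same record, e.g. one from a batch file, one via API.
batch_load = [("C001", "from_batch")]
api_load = [("C001", "from_api")]
print(run_queue([batch_load, api_load]))  # {'C001': 'from_api'} -- deterministic
```

With two independent queues, the final value of `C001` would depend on which load happened to finish last, which is the kind of data discrepancy the single-queue recommendation prevents.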
If you would like to discuss your use case in more depth and show us what you had in mind so that we can document it for the product team, please let me know and I will convert this topic into a ticket.
Thank you for your interest in SDP and the future of Semarchy.
We are looking forward to our collaboration.
Have a good day.
Vinoth KUMAR C
Hi All,
I am reaching out to see if any architectural changes have been made to the data certification process in SDP.
In xDM, every record in a load goes through each of the certification tasks before being published to the GD tables, so we have to wait until the load completes for any record in that load to become available for consumption. The load is treated as a single unit, rather than each record being processed independently by multiple threads to speed up processing and data availability.
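The behavior described above can be sketched as follows. This is a conceptual illustration only, not Semarchy code; the step names are placeholders for the actual certification tasks:

```python
# Placeholder step names standing in for the real certification tasks.
CERTIFICATION_STEPS = ["enrich", "validate", "match", "consolidate"]

def certify_load(records):
    """Process a load as a single unit: each step sees the whole batch."""
    batch = list(records)
    for step in CERTIFICATION_STEPS:
        # The next step starts only after this one finishes for ALL records.
        batch = [f"{step}({r})" for r in batch]
    # Publication to the GD tables happens once, at the end, for the whole load.
    return batch  # no record was consumable before this point

published = certify_load(["rec1", "rec2", "rec3"])
print(published[0])  # consolidate(match(validate(enrich(rec1))))
```

The point of the sketch: even if `rec1` clears every step quickly, it only becomes visible when the final `return` fires for the whole batch, which is the wait-for-the-load behavior described above.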
For a given entity, parallelizing loads is also a little tricky: if I run multiple loads, say one through a batch file and another through the APIs, for the same entity, they execute as sequential loads and one has to wait for the other.
I understand that submitting to multiple queues for the same entity could cause data inconsistencies. This is one of the major challenges for customers with integrations through multiple channels.
It would be really helpful if you could shed some light on whether this is still a limitation in SDP.
Appreciate your inputs. Thanks!