Stambia S19 - How to use SQL_STAT_* to populate a processing tracking table?
V
Virginie_Samson
started a topic
almost 2 years ago
Hello
We have different Mappings that feed different tables.
We have a centralized table that tracks all table loads:
Table Name, Launch Chain Name, Nb Insert, Nb Update, Nb Delete, Load Date, Load Hour, etc.
We found a complicated Rhino script to do this, but we would love to find an easier way.
We are on Stambia S19.
Any good ideas, please?
Thanks
Best Answer
R
Romain ALCARAZ
said
almost 2 years ago
Hi Virginie,
If your need is to register statistics for each execution of your mappings, you'll have to create a common process that inserts the statistics into your centralized stats table.
This process will simply use some input variables in an SQL query.
Then you will be able to use this common process in your existing processes, just after each mapping step.
You will have to specify the right values on that common process. Stambia will substitute these values into the SQL query and execute the corresponding INSERT statement on your centralized stats table.
Here is an example of such a common process:
Here, I use a table named STATS_TABLE with the fields TABLE_NAME, CHAIN_NAME, NB_INSERT, NB_UPDATE and NB_DELETE.
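As an illustration only, here is a minimal sketch of the INSERT such a common process would run, simulated with Python and sqlite3. The STATS_TABLE name and its columns come from the example above; the function name, the sample values, and the use of sqlite3 are assumptions for demonstration. In Stambia, the values would come from the process input variables fed by the mapping step's SQL_STAT_* statistics.

```python
import sqlite3

# In-memory database standing in for the real centralized stats database.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE STATS_TABLE (
        TABLE_NAME TEXT,
        CHAIN_NAME TEXT,
        NB_INSERT  INTEGER,
        NB_UPDATE  INTEGER,
        NB_DELETE  INTEGER
    )
""")

def log_statistics(table_name, chain_name, nb_insert, nb_update, nb_delete):
    """Insert one statistics row, as the common process would do after a mapping step.
    In Stambia the five values would be the process input variables."""
    conn.execute(
        "INSERT INTO STATS_TABLE VALUES (?, ?, ?, ?, ?)",
        (table_name, chain_name, nb_insert, nb_update, nb_delete),
    )

# Example call with made-up values:
log_statistics("DIM_CUSTOMER", "DAILY_LOAD", 120, 35, 0)
print(conn.execute("SELECT * FROM STATS_TABLE").fetchone())
# ('DIM_CUSTOMER', 'DAILY_LOAD', 120, 35, 0)
```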
On the existing process, this is how you use the common process (by drag and drop):
There are different ways to supply the statistic values:
- The first one is the simplest use case: it sums the SQL_STAT_* statistics. You just have to specify the name of the step whose statistics you want. This solution uses a very light Rhino script.
This solution has a limitation: if your mapping performs a LOAD step, your statistics could be wrong, because the number of inserted rows will be higher.
- The second solution is to specify the complete path of your mapping's insert (or update) step, so that Stambia gets the exact statistics. This solution is a little more complicated, because writing the complete path of each step can take time.
Remember that you can add any information you need to your centralized stats table (date, session_id, etc.). You just have to change the SQL query in the common process.
If you want to register the statistics of previous sessions, you'll have to read the log database. I suggest looking at the following tables:
- stb_log_session_sess to get session information
- stb_log_session_stat_sst to get all session statistics
- stb_log_action_stat_ast to get all session step statistics
- stb_log_action_s_act to get session step information
R
Romain ALCARAZ
said
almost 2 years ago
Hello Virginie,
Do you want to register the statistics from previous sessions, or are you looking for a way to log statistics each time a mapping is executed?
V
Virginie_Samson
said
almost 2 years ago
Hello
It can be both.
For the project, the goal is to log the statistics in a final Process, for all the Mappings and Processes that have just finished before it.
V
Virginie_Samson
said
almost 2 years ago
Hello
Thanks for this information. We used part of it.
As we have a large number of Mappings, the ideal would be to automate as much as possible, rather than doing it step by step.
R
Romain ALCARAZ
said
almost 2 years ago
Ok,
So you want to use a final common process (in your existing process) that gathers the statistics?
I suggest using the log database in a mapping (with some log tables as sources and the centralized stats table as target), filtered on session_id.
You will get every statistic you want, with the right step-level detail (stb_log_action_stat_ast and stb_log_action_s_act tables).
V
Virginie_Samson
said
almost 2 years ago
Hello
Good idea. Do you have a physical model of those tables, with the join keys?
R
Romain ALCARAZ
said
almost 2 years ago
Hello Virginie,
This is how the two tables are used:
The first table (stb_log_action_stat_ast) contains every step (act_id) executed in a session (sess_id). Each line holds one volume statistic: the type of statistic (acp_short_name) and its value (ast_stat_sum).
If you want the name of the step, you'll need the second table (stb_log_action_s_act), which contains the name of the step (act_short_name) and its path (act_parent_path).
I suggest doing several tests to validate that you are using the correct values. It all depends on what you want to retrieve in the table that centralizes your statistics.
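The two-table lookup described above can be sketched as follows, again simulated with sqlite3. Only the table and column names (act_id, sess_id, acp_short_name, ast_stat_sum, act_short_name, act_parent_path) come from the post; the minimal schemas, the join on act_id and sess_id, and all sample rows are assumptions for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Assumed minimal schemas; the real Stambia log tables have more columns.
    CREATE TABLE stb_log_action_stat_ast (
        sess_id INTEGER, act_id INTEGER,
        acp_short_name TEXT, ast_stat_sum INTEGER
    );
    CREATE TABLE stb_log_action_s_act (
        sess_id INTEGER, act_id INTEGER,
        act_short_name TEXT, act_parent_path TEXT
    );
    -- Hypothetical rows for one session (sess_id = 42):
    INSERT INTO stb_log_action_s_act VALUES
        (42, 1, 'Integration', '/MyProcess/MyMapping');
    INSERT INTO stb_log_action_stat_ast VALUES
        (42, 1, 'SQL_STAT_INSERT', 120),
        (42, 1, 'SQL_STAT_UPDATE', 35);
""")

# Join statistics to step names/paths, filtered on one session.
rows = conn.execute("""
    SELECT act.act_short_name, act.act_parent_path,
           ast.acp_short_name, ast.ast_stat_sum
    FROM stb_log_action_stat_ast ast
    JOIN stb_log_action_s_act   act
      ON act.act_id = ast.act_id AND act.sess_id = ast.sess_id
    WHERE ast.sess_id = 42
    ORDER BY ast.acp_short_name
""").fetchall()
for row in rows:
    print(row)
# ('Integration', '/MyProcess/MyMapping', 'SQL_STAT_INSERT', 120)
# ('Integration', '/MyProcess/MyMapping', 'SQL_STAT_UPDATE', 35)
```

A mapping with these tables as sources and the centralized stats table as target would perform essentially this join, with the session_id supplied as a filter.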
V
Virginie_Samson
said
almost 2 years ago
Hello
That can be a good idea.
How do you catch the sess_id in the Process that is ending its execution?
V
Virginie_Samson
said
almost 2 years ago
Hello
That could be a good way.
A few questions come to mind:
When is the Repository database updated? While the process is executing, or after?
Where can we get the execution timestamp?
Do you have a complete model of all the links we can use on this database?
Thanks for your help.
For information: each day, we load this data into our DWH.
R
Romain ALCARAZ
said
over 1 year ago
Hi Virginie,
Sorry for the delay.
- If you want to get statistics during the execution of a process (for instance, statistics for several mappings or steps that have already executed), you have to use the first method described in this post (with the path of the statistics steps): these statistics are stored after each step has ended, even while the session is still running. However, if you want statistics for ended sessions, you can use the log database.
- The table stb_log_action_act contains the start (act_begin_date) and end (act_end_date) timestamps for every step.
- About the log database model: I sent you an email with specific information. It's worth knowing that our teams are working on several articles, including one on this subject.
If you encounter problems or bugs in our Semarchy software, do not hesitate to report them to our support.
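Using the act_begin_date and act_end_date columns mentioned above, a step's duration can be derived in the query itself. The sketch below does this with sqlite3; the stb_log_action_act table and the two timestamp columns come from the post, while the act_short_name column, the minimal schema, and the sample row are assumptions for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Assumed minimal schema; the real table has more columns.
    CREATE TABLE stb_log_action_act (
        act_short_name TEXT, act_begin_date TEXT, act_end_date TEXT
    );
    -- Hypothetical row: a 3m30s step.
    INSERT INTO stb_log_action_act VALUES
        ('Load DIM_CUSTOMER', '2024-01-15 02:00:00', '2024-01-15 02:03:30');
""")

# Duration in seconds = end epoch - begin epoch (SQLite strftime('%s', ...)).
rows = conn.execute("""
    SELECT act_short_name,
           strftime('%s', act_end_date) - strftime('%s', act_begin_date)
               AS duration_seconds
    FROM stb_log_action_act
""").fetchall()
print(rows)  # [('Load DIM_CUSTOMER', 210)]
```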
V
Virginie_Samson
said
over 1 year ago
Hello
Any news about the upcoming article?
S
Stéphanie FOURRIER
said
over 1 year ago
Hi Virginie,
Are you talking about the article to help upgrade from earlier versions to the new ones? It is in progress and currently under internal review.
Stéphanie.