How to optimize high CPU utilization in the database?
Bharat Joshi
started a topic
over 1 year ago
job stuck in a request involving DL_BATCH
All integration batches are stuck, and RDS CPU utilization is reaching 100%.
Pasting TOP SQL:
-- Job: CREATE_ACCOUNTS_1
-- Task /Validate Source Authoring Data/AddressMapping/Change ErrorStatus of reintegrated SD records from ERROR to OBSOLETE_ERROR
update SA_MDM_ADRS_MAP as T
set
B_ERROR_STATUS = 'OBSOLETE_ERROR'
from (
select
PREV_SA.B_LOADID, PREV_SA.ADRS_MAP_ID
from SA_MDM_ADRS_MAP PREV_SA
inner join DL_BATCH PREV_B
on (
PREV_SA.B_LOADID = PREV_B.B_LOADID
and PREV_B.B_BATCHID is not null and PREV_B.B_BATCHID < 432004
)
inner join SA_MDM_ADRS_MAP CUR_SA
on (
PREV_SA.ADRS_MAP_ID = CUR_SA.ADRS_MAP_ID
and CUR_SA.B_LOADID = 434050
)
where
PREV_SA.B_ERROR_STATUS is not null and PREV_SA.B_ERROR_STATUS = 'ERROR'
) as S
where
S.B_LOADID = T.B_LOADID
and S.ADRS_MAP_ID = T.ADRS_MAP_ID
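Before running any maintenance, it can help to confirm that DL_BATCH is actually bloated or carrying stale statistics. A diagnostic sketch against PostgreSQL's standard statistics view (table name taken from the query above, lower-cased as it appears in the catalog):

```sql
-- Check dead-tuple count and the last vacuum/analyze times for DL_BATCH;
-- a large n_dead_tup or old last_analyze often explains a bad plan.
select relname, n_live_tup, n_dead_tup,
       last_vacuum, last_autovacuum,
       last_analyze, last_autoanalyze
from pg_stat_user_tables
where relname = 'dl_batch';
```

Running plain EXPLAIN on the stuck UPDATE (EXPLAIN alone does not execute it) before and after the maintenance is a safe way to verify the plan actually changed.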
Best Answer
Bharat Joshi
said
over 1 year ago
Run VACUUM FULL on the DL_BATCH table.

Explanation: there is a good chance that the plan the optimizer builds for this query no longer corresponds to the actual contents of the tables, which makes it inefficient. Running VACUUM FULL reclaims the bloat, and following it with ANALYZE refreshes the planner statistics so PostgreSQL can generate a better plan.
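As a sketch, the maintenance suggested above could look like the following. Note that VACUUM FULL takes an ACCESS EXCLUSIVE lock and rewrites the whole table, so it should be run while the integration batches are paused:

```sql
-- Rewrite DL_BATCH to reclaim dead-tuple bloat; blocks all access
-- to the table while it runs (ACCESS EXCLUSIVE lock).
vacuum full DL_BATCH;

-- Refresh planner statistics so subsequent plans reflect the real data.
analyze DL_BATCH;

-- The staging table joined in the stuck query may benefit as well.
analyze SA_MDM_ADRS_MAP;
```

In many cases a plain VACUUM ANALYZE (which does not block readers and writers) is enough to fix the plan; VACUUM FULL is the heavier option when the table is severely bloated.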