Issue
When investigating performance problems in xDM, it is often difficult to determine:
  • What exact SQL query is being executed.
  • What parameter values are being bound to the query.
  • Why a screen, stepper, or action is slow.
  • Whether the issue originates from the database layer.

Without visibility into the executed SQL, it is impossible to properly analyze execution plans or identify optimization opportunities.

Cause

xDM executes database queries through the JDBC layer.


By default:

  • SQL statements are not fully logged.
  • Prepared statement parameters (?) are not visible.
  • Only high-level application logs are available.


As a result:

  • The actual runnable SQL query cannot be directly extracted.
  • Performance root causes remain unclear.
  • Database-level optimization (indexes, statistics, joins) cannot be accurately evaluated.


Solution

To capture and analyze executed SQL queries, you must temporarily enable TRACE-level logging for the JDBC components.


Step 1 - Activate required loggers

Modify the logging configuration and set the following loggers to TRACE (log4j2 syntax shown):

<Root level="TRACE"/>
<Logger level="TRACE" name="org.springframework.jdbc.core.JdbcTemplate"/>
<Logger level="TRACE" name="org.springframework.jdbc.core.StatementCreatorUtils"/>


What these loggers provide:

  • JdbcTemplate: logs executed SQL statements.
  • StatementCreatorUtils: logs parameter values bound to the prepared statements.

Together, they allow reconstruction of the complete runnable SQL query.
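As an illustrative sketch, pairing each logged statement with the parameter values that follow it can be scripted. The message patterns below are modeled on the log excerpts quoted in this article; the exact wording may vary between Spring versions, so treat them as assumptions to adjust:

```python
import re

# Patterns modeled on the log excerpts in this article; adjust them if your
# Spring version formats these messages differently.
STMT_RE = re.compile(r"Executing prepared SQL statement \[(?P<sql>.*)\]")
PARAM_RE = re.compile(r"column index (?P<idx>\d+), value \[(?P<val>[^\]]*)\]")

def pair_statements(log_lines):
    """Group each SQL statement with the parameter values logged after it."""
    pairs = []
    for line in log_lines:
        stmt = STMT_RE.search(line)
        if stmt:
            pairs.append((stmt.group("sql"), []))
            continue
        param = PARAM_RE.search(line)
        if param and pairs:
            pairs[-1][1].append(param.group("val"))
    return pairs

log = [
    "TRACE JdbcTemplate - Executing prepared SQL statement [SELECT * FROM MY_TABLE WHERE ID = ?]",
    "TRACE StatementCreatorUtils - Setting SQL statement parameter value: column index 1, value [12345]",
]
print(pair_statements(log))
# [('SELECT * FROM MY_TABLE WHERE ID = ?', ['12345'])]
```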


Step 2 - Follow safe logging practices

/!\ TRACE logging is extremely verbose and can heavily impact performance.


It is strongly recommended to:

  • Perform the test in a lower (non-production) environment.
  • Ensure minimal concurrent activity.
  • Prepare the UI use case before activating logging.
  • Clear the PDE.
  • Activate the loggers.
  • Reproduce the exact use case.
  • Immediately revert logging to default levels.
  • Extract the PDE logs.


Step 3 - Handle log continuation

If the extracted log begins with a line such as:

This is a continuation of log file .../work/Catalina/localhost/semarchy/eclipse/workspace/.metadata/.bak_xxx.log

then log rotation has occurred.


In that case:

  • Extract the entire .metadata folder.
  • Consolidate all related log files before analysis, since successive PDE files may overwrite previous entries.
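When several rotated files must be merged, a short script can concatenate them in chronological order. This is a sketch under two assumptions: the rotated files match the "*.log" pattern (adjust it to your actual file names), and modification time reflects rotation order:

```python
from pathlib import Path

def consolidate_logs(metadata_dir, output_file):
    """Concatenate rotated log files, oldest first, into a single file."""
    # Assumption: modification time reflects rotation order; adjust the
    # "*.log" pattern to match the actual rotated file names.
    logs = sorted(Path(metadata_dir).glob("*.log"),
                  key=lambda p: p.stat().st_mtime)
    with open(output_file, "w") as out:
        for log_file in logs:
            out.write(log_file.read_text())
```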


Step 4 - Reconstruct the query

In the logs, you will see:

  • SQL with placeholders:
SELECT * FROM MY_TABLE WHERE ID = ?
  • Followed by parameter values:
Setting SQL statement parameter value: column index 1, value [12345]


To obtain a runnable query:

  • Replace each "?" with its corresponding value.
  • Ensure proper quoting for strings and dates.


Example:

SELECT * FROM MY_TABLE WHERE ID = 12345;

You now have a fully executable SQL statement.
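For queries with many placeholders, the substitution can be scripted. The sketch below is for manual analysis only: it naively inlines values (numbers as-is, strings quoted with single quotes doubled, dates as ANSI DATE literals) and must never be used to build queries in application code:

```python
from datetime import date

def inline_params(sql, params):
    """Replace each '?' placeholder with a SQL literal, in order."""
    out = sql
    for value in params:
        if value is None:
            literal = "NULL"
        elif isinstance(value, (int, float)):
            literal = str(value)
        elif isinstance(value, date):
            literal = "DATE '%s'" % value.isoformat()
        else:
            # Quote strings and double any embedded single quotes.
            literal = "'%s'" % str(value).replace("'", "''")
        # Note: a literal containing '?' would confuse later replacements;
        # acceptable for a quick manual reconstruction.
        out = out.replace("?", literal, 1)
    return out

print(inline_params("SELECT * FROM MY_TABLE WHERE ID = ? AND NAME = ?",
                    [12345, "O'Brien"]))
# SELECT * FROM MY_TABLE WHERE ID = 12345 AND NAME = 'O''Brien'
```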


Step 5 - Analyze in the database

Execute the reconstructed query directly in the database and run an explain plan on it.

Here is an article detailing how to produce an explain plan for analysis.


This helps identify:

  • Full table scans
  • Missing indexes
  • Expensive joins
  • Nested loop inefficiencies
  • Cardinality misestimation
  • Outdated statistics
  • Non-optimized join conditions
  • ...

/!\ This step is technical and may require the assistance of a Database Administrator (DBA), especially in production-like environments.


Step 6 - Apply optimizations

Based on the explain plan, possible improvements may include:

  • Creating new indexes
  • Recomputing statistics
  • Optimizing join conditions
  • Reducing unnecessary filters
  • Reviewing model design
  • Eliminating function-based filtering on indexed columns
  • ...

Keep in mind that multiple factors may contribute to the same performance issue.

/!\ Index creation, statistics management, and execution plan interpretation are advanced database tasks and should ideally be performed or validated by a DBA to avoid unintended performance impacts.