Performance Considerations and Kinetis Particularities

The Kinetis local memory controller has some particularities that affect debugger performance. Besides the common performance degradation caused by the latency of reading the cache and updating the cache view, there are specific performance issues that must be taken into account when performing various operations with the debugger.

When accessing a memory location, the Kinetis platform uses the same mechanism regardless of the source of the access, whether the core or the debugger module. Therefore, without preventive measures, the Kinetis cache can easily be polluted by debugger reads and writes to memory. To solve this problem, additional algorithms were developed to reduce debugger intrusiveness and to maintain cache coherency.

For example, the largest performance degradation may be observed when stepping with the cache enabled and a memory monitor active on a write-back region. At each step, besides the time required to populate the cache view, the user may experience latency roughly four times longer than usual. This latency is caused by the debugger intrusiveness prevention algorithm, which is activated when accessing cacheable regions and is slow because it performs cache searches for all the data read or written by the debugger. A workaround for this performance issue is to flush and then disable the cache before stepping while a write-back region memory monitor is active.
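The flush-then-disable workaround can also be performed from application code. The following is a minimal sketch assuming the K-series Local Memory Controller (LMEM) cache control register (PCCCR) bit layout; the bit positions and the `LMEM_PCCCR` address are assumptions that should be verified against the device reference manual. The helpers only compute register values, so the on-target sequence is shown in comments:

```c
#include <stdint.h>

/* Assumed LMEM_PCCCR bit layout (verify against the device manual). */
#define LMEM_PCCCR_ENCACHE (1u << 0)   /* cache enable                 */
#define LMEM_PCCCR_PUSHW0  (1u << 25)  /* push (flush) way 0           */
#define LMEM_PCCCR_PUSHW1  (1u << 27)  /* push (flush) way 1           */
#define LMEM_PCCCR_GO      (1u << 31)  /* start command; cleared by HW */

/* Value that commands a flush (push) of both cache ways. */
static uint32_t lmem_flush_cmd(uint32_t pcccr)
{
    return pcccr | LMEM_PCCCR_PUSHW0 | LMEM_PCCCR_PUSHW1 | LMEM_PCCCR_GO;
}

/* Value with the cache disabled, for use once GO reads back as 0. */
static uint32_t lmem_disable(uint32_t pcccr)
{
    return pcccr & ~LMEM_PCCCR_ENCACHE;
}

/* On target (the LMEM_PCCCR address is device specific):
 *   *LMEM_PCCCR = lmem_flush_cmd(*LMEM_PCCCR);
 *   while (*LMEM_PCCCR & LMEM_PCCCR_GO) { }   // wait for flush done
 *   *LMEM_PCCCR = lmem_disable(*LMEM_PCCCR);
 */
```

Keeping the helpers as pure functions separates the bit manipulation from the hardware access, which also makes the sequence easy to check off-target.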

Another Kinetis-specific issue is that the cache is not automatically flushed or invalidated at a normal reset; this is done only at a power-on reset. Therefore, a user who enables the cache in a debug session through the Cache Viewer should also issue an Invalidate command immediately after the Enable Cache command.
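The same ordering applies when application code enables the cache itself rather than using the Cache Viewer. Below is a sketch under the same assumed LMEM PCCCR bit layout as above (bit positions and the `LMEM_PCCCR` address are assumptions to check against the reference manual); the helpers compute register values, with the on-target sequence in comments:

```c
#include <stdint.h>

/* Assumed LMEM_PCCCR bit layout (verify against the device manual). */
#define LMEM_PCCCR_ENCACHE (1u << 0)   /* cache enable                 */
#define LMEM_PCCCR_INVW0   (1u << 24)  /* invalidate way 0             */
#define LMEM_PCCCR_INVW1   (1u << 26)  /* invalidate way 1             */
#define LMEM_PCCCR_GO      (1u << 31)  /* start command; cleared by HW */

/* Value with the cache enabled. */
static uint32_t lmem_enable(uint32_t pcccr)
{
    return pcccr | LMEM_PCCCR_ENCACHE;
}

/* Value that commands an invalidate of both cache ways. */
static uint32_t lmem_invalidate_cmd(uint32_t pcccr)
{
    return pcccr | LMEM_PCCCR_INVW0 | LMEM_PCCCR_INVW1 | LMEM_PCCCR_GO;
}

/* On target, mirror the Cache Viewer sequence (Enable, then Invalidate):
 *   *LMEM_PCCCR = lmem_enable(*LMEM_PCCCR);
 *   *LMEM_PCCCR = lmem_invalidate_cmd(*LMEM_PCCCR);
 *   while (*LMEM_PCCCR & LMEM_PCCCR_GO) { }   // wait for invalidate
 */
```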

Figure 1. Debug Perspective - Cache and Memory View