Analyzer Window


Table of Contents

1 Introduction
1.1 Related Documents
2 Concepts
2.1 File Storage
2.1.1 Support for legacy .CCV coverage documents
2.1.2 Host Performance
2.2 Triggers
2.2.1 Analyzer Configuration List
2.2.2 Adding a new trigger
2.3 Trace
2.3.1 Configuration
2.4 Profiler
2.4.1 Contexts
2.4.2 Configuration
2.5 Coverage
2.5.1 Configuration
3 User Interface
3.1 Window Elements
3.1.1 Main Toolbar
3.1.2 Status
3.1.3 Markers
3.2 Timeline
3.2.1 Columns
3.2.2 Context Selection
3.2.3 Event Display
3.2.4 Zoom
3.2.5 Result Filtering
3.2.6 Event Tracking
3.2.7 Find
3.3 Statistics
3.3.1 Event Display
3.3.2 Area Information
3.4 Trace
3.4.1 Toolbar
3.4.2 Columns
3.4.3 Trace Items
3.4.4 Find
3.5 Coverage
3.5.1 Toolbar
3.5.2 Columns
3.5.3 Navigation
4 Export
4.1 Trace
4.1.1 Text Format
4.1.2 CSV Format
4.1.3 Binary Format
4.1.4 XML Format
4.2 Profiler
4.2.1 Text1 Export Format Specification
4.2.2 XML Format
4.3 Coverage
4.3.1 XML Format specification
4.3.2 HTML Format specification
5 Appendix
5.1 Trigger Templates
5.1.1 Creating a Template Trigger
5.1.2 Editing an Existing Template Trigger
5.1.3 Using a Template Trigger


1 Introduction

The Analyzer is used to capture the real-time activity of an application. From a single raw recording of the trace stream, trace, profiler and coverage functionality is derived.

1.1 Related Documents

The Profiler Concepts manual explains the theory of profiling in much more detail.

The Emulation Technical Notes for the respective CPU provide more specific information; especially the trace trigger and qualifier configuration is highly CPU dependent.

The IO Module manual explains how the I/O module (optional on iC5000) can be used to capture external digital and analog signals along with CPU trace.

The reference manual of the respective CPU should be consulted to understand the on-chip trace capabilities.


2 Concepts

2.1 File Storage

The Analyzer configuration and recorded data are stored in a single file with a .TRD extension.

There is no limit to the number of open Analyzer files, but the View/Analyzer shortcut always opens the TRD file carrying the name of the winIDEA workspace.

For full functionality, the Analyzer requires:

Program code

Symbolic information

OS configuration files

Application source code

To facilitate synchronization of all these, the .TRD file can embed them along with the raw trace recording.

When a live Analyzer session is started (the green Begin button), the current symbol and source image (all download files and all source files) are copied into the .TRD file.

When a re-analyze session is started (the blue Begin button), the image from the .TRD file is used – this ensures that the recording is in sync with symbols and sources.

When a .TRD file is opened, a re-analyze is initiated automatically.

Note: when a .TRD file is included in a support file, all source code is stripped automatically, but the program code and symbol table remain, as they are critical for correct trace reconstruction.

2.1.1 Support for legacy .CCV coverage documents

In winIDEA versions up to 9.12.121, coverage recordings were saved in files with the .ccv extension. These files can no longer be opened in winIDEA 9.12.234 or later; to view the coverage data, the file must be converted to a .trd file.

To convert a .ccv file to a .trd file, simply change the file extension to .trd. When such a file is opened in a winIDEA version newer than 9.12.121, it is converted to the newer .trd format, and the settings specified in the old .ccv file are preserved. The file can then be used with newer winIDEA to run coverage with the same settings as in the old .ccv file.
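The rename step can be scripted; below is a minimal sketch in Python (the function name is illustrative – winIDEA itself performs the actual format conversion when the renamed file is first opened):

```python
from pathlib import Path

def convert_ccv(path):
    """Rename a legacy .ccv coverage document to .trd.

    winIDEA newer than 9.12.121 converts the renamed file to the
    current .trd format on first open, preserving the settings
    stored in the old .ccv file.
    """
    src = Path(path)
    if src.suffix.lower() != ".ccv":
        raise ValueError("not a .ccv coverage document: %s" % src)
    dst = src.with_suffix(".trd")
    src.rename(dst)
    return dst
```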

2.1.2 Host Performance

Analyzer recordings can be very large (as set in the Buffer Size field). Most of this data is kept on disk in the designated temporary folder. If a disk with better performance is available, the ISYSTEM_TEMP environment variable can be set to a folder on that disk.
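As an illustration of the intended effect (not winIDEA's actual implementation), a temporary-folder lookup that prefers ISYSTEM_TEMP might look like this:

```python
import os
import tempfile

def analyzer_temp_dir():
    """Illustrative lookup order: prefer the folder named by the
    ISYSTEM_TEMP environment variable; otherwise fall back to the
    system temporary folder."""
    return os.environ.get("ISYSTEM_TEMP") or tempfile.gettempdir()
```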

Profiler and coverage analyses are parallelized; a multi-core PC increases overall processing performance.

Recommended system configuration:

4-8 core CPU

8GB or more RAM

100GB free space on the TEMP drive. The drive should be defragmented.


2.2 Triggers

A trigger is a set of configurations for the on-chip trace, the emulator and winIDEA. One Analyzer document can store several trigger configurations, which can be switched easily.

2.2.1 Analyzer Configuration List

The Analyzer Configuration List window is opened from the Analyzer Configuration drop-down menu by selecting the Edit Analyzer Configuration List… option. The dialog displays all trigger items in the analyzer document.

These operations are available:

New – adds a new trigger

Edit – edits the selected trigger

Rename – renames the selected trigger

Set Active – the selected trigger will be used when the Begin button is pressed

Copy – copies the selected trigger into a new one

Delete – deletes the selected trigger (not available for the Record Everything trigger)

Default Trigger Configuration

Every analyzer document comes with a Default trigger. It is preconfigured to record all program trace and, depending on the CPU trace architecture, also data trace.

Note: on on-chip trace architectures, data trace is not recorded to prevent on-chip-trace FIFO overflows.

This trigger item cannot be removed.

Hardware specific configuration is accessible via the Analyzer Setup button.

Analyzer Setup dialog

Analyzer Operation mode offers a selection of one of the trace ports, the on-chip trace buffer, or one of the other trace methods provided by iSYSTEM tools (user trace port, software trace, IO module trace). Read more about the operation modes in the Trace Concepts chapter.

Cycle duration can be set when using the on-chip trace buffer, where timestamps are generated by the CPU as a number of CPU cycles. The exact duration of one CPU cycle must be entered for correct timestamp information.
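The timestamp reconstruction is simple arithmetic; a sketch (the function name is illustrative):

```python
def cycles_to_time(cycle_count, cpu_hz):
    """Convert a cycle-count timestamp to seconds.

    One CPU cycle lasts 1 / cpu_hz seconds, so e.g. at 100 MHz
    one cycle is 10 ns and 1000 cycles are 10 microseconds.
    """
    return cycle_count / cpu_hz
```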

The Analyzer relies on application information obtained from the download file which was used to load the application to memory. If executed code is not part of the download file, the analyzer reports a Missing code error, indicating that the code at the given address was not part of the download file. The option Code missing from download file should be read at run time enables the analyzer to read the missing code directly from memory.


2.2.2 Adding a new trigger

A new trigger is added from the Analyzer drop-down menu option “Create New Configuration” or by choosing “New” in the Analyzer Configuration List.

The New Configuration window offers basic settings for the new trigger:

Name – unique name for the trigger

Analysis – checkboxes to enable Profiler and Coverage analysis. Trace is always performed.

Hardware Trigger Configuration – offers several ways to set up the hardware trigger:

Automatic – winIDEA determines the best configuration for the desired analysis, while considering CPU trace architecture limitations.

Wizard – after the new configuration is created, the Trigger Wizard is started, which enables the user to create the most commonly used triggers.

Program flow + Instrumentation – the trigger is set to record program flow and instrumentation trace.

Manual – manual trigger/recorder configuration is enabled; the user must configure the trigger manually.

Template – opens the list of available templates, from which the user can select the desired template and modify the template’s parameters.


2.3 Trace

Trace provides the deepest possible insight into the real-time operation of the application. The quality of the captured information depends on the trace capabilities of the CPU and of the trace tool used, and can include:

Program trace

Data trace

Instrumentation trace

Auxiliary I/O trace

Trace view reconstructs the low level captured events to:

Program counter movement

Instruction flow on assembly level

Source code flow

Data variable read/write accesses

Labels (association between the address and a symbol obtained from the download file)


2.3.1 Configuration

Trace configuration is CPU specific, as most settings depend on the trigger logic implemented inside the CPU (refer to the Emulation Technical Notes for information on a specific CPU).

Since on-chip-trace configuration is typically very complex, winIDEA provides shortcuts to typical scenarios:

Trigger Wizard

The Trigger Wizard provides a few simple steps to configure the most commonly used:

Triggers – on variable access or function execution

Qualifiers – record program flow, data accesses, etc.

The Wizard is initiated either when Wizard is selected as the Hardware Trigger Configuration, or by clicking the Wizard… button in the trigger configuration dialog.



When Wizard finishes, the trigger configuration is set according to the CPU’s trace module and the wizard selections.

Note: previous trigger configuration is overwritten by the wizard configuration.  

Trigger Templates

Some on-chip trigger systems are very complicated to set up. A library of trace trigger templates can be created by a specialist and shared with other users who can quickly create and manipulate a trace trigger configuration for a typical task.  

A template trigger provides a level of abstraction on top of the trigger configuration, which makes its use quite straightforward. The idea is:

An experienced user, familiar with the on-chip trace engine, can create and test a specialized trigger configuration; for example – record all accesses to variable MyVar while executing function MyFunc.

Once the configuration is tested, only a few parameters are marked as configurable (e.g. name of the variable and the function), and the configuration is saved as a template.

The template is shared with other users, who can use it for this purpose but need to specify only the configurable parameters.

Any number of templates can be created.

For more information see the Trigger Templates appendix.


2.4 Profiler

The Profiler provides a higher-level view of the trace information. Program counter flow and memory accesses are decoded into these event types:

Function execution

Data State changes

Data acquisitions

External I/O data acquisitions

Task and IRQ context switch events

Event flow is displayed in the Timeline view; global statistics are displayed in the Statistics view.

Note: see Profiler Concepts manual for technical background on how the profiler works.


2.4.1 Contexts

In a multitasking OS application, the OS switches the context (register set, stack) for every task and possibly also for IRQ servicing. To correctly analyze the program flow and provide correct statistical information, the analyzer must be aware of how context switches are signaled.


2.4.2 Configuration

Profile Code

To profile functions and program lines, check the Profile/Code option.

Code Area

In the Code Areas list, add all functions which should be profiled by clicking the New button.

Note: if the Code Areas list is empty but Code Profiling is checked, all functions are profiled.

The leading check box can be used to disable an already configured area.

Multiple function areas can be specified with wildcards:

* – any sequence of characters

? – any single character

# – any single digit

[set] – any of the characters in the set

[!set] – none of the characters in the set

A set is defined as a sequence of characters. If a dash is used, all characters within the range qualify, e.g. [a-z] defines all characters in the range a to z.

For example, the pattern CAN_* profiles all functions whose names start with CAN_.
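The wildcard syntax can be illustrated by translating it to a regular expression. This is a sketch, not winIDEA's implementation; case-insensitive matching is assumed here:

```python
import re

def wildcard_to_regex(pattern):
    """Translate the area wildcard syntax to a compiled regex.

    *      -> .*        any sequence of characters
    ?      -> .         any single character
    #      -> [0-9]     any single digit
    [set]  -> [set]     any character in the set
    [!set] -> [^set]    none of the characters in the set
    """
    out, i = [], 0
    while i < len(pattern):
        c = pattern[i]
        if c == "*":
            out.append(".*")
        elif c == "?":
            out.append(".")
        elif c == "#":
            out.append("[0-9]")
        elif c == "[":
            j = pattern.index("]", i)          # end of the set
            body = pattern[i + 1:j]
            if body.startswith("!"):           # negated set
                body = "^" + body[1:]
            out.append("[" + body + "]")
            i = j
        else:
            out.append(re.escape(c))           # literal character
        i += 1
    return re.compile("".join(out) + r"\Z", re.IGNORECASE)
```

For example, `wildcard_to_regex("CAN_*")` matches `CAN_Transmit` but not `LIN_Send`.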

To get the analysis of the function’s lines, enable the Include function lines option.

Note: To find a specific function more easily, the function configuration list provides a filter field. The filter accepts whitespace-separated character sequences; the list shows only functions in which all character sequences are found.

Profile Data

To profile data objects (variables), check the Profile/Data option.

In the Data/Areas list, add all variables which should be profiled by clicking the New button.


This field is optional. If specified, the given description is shown in the Analyzer window; otherwise the variable name is displayed.


Specify the variable to profile.

If the CPU implements instrumentation trace (e.g. OTM), the instrumentation message can be configured by selecting it in the combo box.


For typed variables, leave this setting at Entire object. If a raw memory location is to be profiled, its size in bytes is specified here.

Use only

If a data object is composed of several distinct sub-items (bit fields,…), a part of the data object can be extracted by specifying its bit size and bit offset within the parent data object.

Several data objects can be defined on top of a parent data entity.

Note: refer to Profiler Concepts manual for more information especially on big endian systems.
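The extraction itself is a shift-and-mask operation; a sketch assuming a little-endian layout (big-endian systems differ – see the Profiler Concepts manual):

```python
def extract_field(value, bit_offset, bit_size):
    """Extract a sub-item (e.g. a bit field) from a parent data
    object: shift the parent value right by bit_offset, then mask
    off bit_size bits (little-endian bit numbering assumed)."""
    mask = (1 << bit_size) - 1
    return (value >> bit_offset) & mask
```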

Value display

The values written to the variable can be displayed in these number formats:

Hexadecimal (default)



Value alias definition

Values written to a variable can have a symbolic equivalent. The analyzer can map the literal values to symbolic names found in:

enum type

include file containing #define statements


A profiled variable can be treated (and different statistics derived) as:

State variable – for every value written, separate statistics and timelines are maintained. Example: active task ID.
Note that the number of states is limited by the Max number of states setting to prevent excessive memory usage.

Regular variable – values written are displayed in an oscilloscope format. Example: result of an A/D conversion.

Area entry/exit identification – the values are interpreted as signals of function entry and exit. This can be used when no program trace is available, or when exact measurement of a small number of areas over a very long time period is required. Program code instrumentation is required. Refer to the Profiler Concepts manual for more information.


The minimum and maximum range for regular variables can be specified here. The timeline view scales the recorded values within this range.

If no values are given, the result is scaled to the minimum and maximum values detected in the current session.

Profile OS objects

To profile OS objects, check the Profile/OS objects option.

The objects and their locations are specified by the OS plugin, where further configuration should be performed. Click OS Setup… to configure the operating system.

Profile AUX

To profile AUX lines, check the Profile/AUX option.

The AUX inputs are configured in the hardware specific configuration. Further configuration should be performed there.

Advanced configuration

The Advanced button provides further configuration options:

Tail Optimization Analysis

Compilers perform several kinds of optimizations on function exit code. The profiler must recognize:

Tail merge – this optimization effectively moves part of the code body of function A into another function B. In range mode, execution in function B would be attributed to function B instead of the optimized function A.

Tail call – this optimization occurs when function A calls another function B just before it exits. Instead of a call op-code, a branch is used; when function B returns, function A effectively returns too.

If this option is enabled, the profiler analyzes tail optimizations on the fly. This analysis requires a higher level of debug information quality and relies on object code analysis. If the analysis algorithm fails, the profiler session aborts; in that case, tail optimization analysis can be disabled to revert to regular range mode.

Stack Killers

Functions which terminate the task stack should be specified here. Usually this will be just the central OS scheduler function. If there are multiple, separate them with commas.

Allow functions without exits

When checked, profiler allows functions without exits (typically main on embedded applications).

Ignore functions which exit on entry

When checked, winIDEA will ignore (instead of reporting an error) functions whose only instruction is the return statement.

This option is applicable only in Entry/Exit mode.


Some CPUs implement a dedicated signaling mechanism which the application can use to communicate with the tool. Normally such messages carry 32-bit data, which is sufficient for most uses, but some CPUs provide only 8-bit data. In that case, the application can use an encoding protocol to pass larger data safely to the tool.

Several OSEK operating systems use this method on MPC5xxx CPUs as a standard extension. In that case, the profiler automatically uses the correct encoding and no further configuration is required.

Refer to Profiler Concepts manual for more information.

Prefix hexadecimal values with 0x

When hexadecimal values are displayed, the 0x prefix can be prepended.

Ignore context reactivation

By default, the profiler ignores repeated task IDs and considers that the same task/IRQ continues without interruption. Optionally, re-activation can be treated as a separate activation of the context. The changed interpretation is reflected in the statistics for the task.

Discard execution in unknown context

Before a task/IRQ context is known, the code execution can be ignored; otherwise the recorded code is attributed to an unknown context which does not exist in the application.

If, on the other hand, the application is profiled from CPU start, the OS is not yet active and the unknown context is a valid state; the option should then be disabled.

Merge data areas

By default, the analyzer attempts to use one hardware data comparator per data area. On most on-chip trace systems, the number of these comparators is small and often insufficient.

The profiler can merge multiple data areas into a single range, thus recording many areas using just a single comparator.

Note: If the data areas of interest are far apart in terms of address locations and frequently written variables lie between them, this technique increases the data trace bandwidth and could lead to trace port overflow. In that case, the data variable layout in the application could be rearranged so that the variables of interest are closer together.

Start at

The starting point can be used to start the profiler when the application executes from the specified address. You may use this option to synchronize multiple profiler sessions on the same event.

Analyze only events after start point

By default, the profiler analyzes the entire session, including events which occur before the starting point. To limit the analysis to events after the starting point, set this option.

Limit session duration

When checked, the profiler session duration is limited to the specified time. After the specified duration, the profiler analysis is stopped.

Ignore unknown functions / variables

When checked, winIDEA will ignore (instead of reporting an error) functions or variables which were, for instance, manually defined but have no match in the symbol table.


2.5 Coverage

Coverage provides a higher-level view of the trace information. Program counter flow, including information about conditional op-code execution, is used to derive these types of execution coverage information:

Statement coverage (object and source code level)

Function coverage

Call coverage

Condition coverage (object code level)

Decision coverage (object code level)

For more theoretical information about coverage, see Coverage Concepts manual.

Coverage statistics are displayed in the coverage view.



2.5.1 Configuration

The Scope of coverage – the memory regions which will be covered – should be selected first.

All Downloaded Code – all memory regions where code was loaded during download are covered. Individual regions can be excluded by explicit specification in the list.

Entire Memory – the entire memory is covered. Individual regions can be excluded by explicit specification in the list.

Custom – the list specifies the regions which are included in the coverage.


The Include/Exclude list allows specification of these types of regions:

Image – the code from the specified download file

Folder – all modules from the specified folder are included

Module – the code from the specified module

Function – code belonging to the specified function

Range – manually specified range

Assembler Information

If this option is checked, coverage statistics are also generated for individual instructions of a source line.

Manual Trigger/Recorder configuration

This advanced setting allows custom configuration of the on-chip trace logic. It should be used only when the automatic configuration is not sufficient. For example, when a device with a limited on-chip trace buffer is used, trace message generation can be limited to execution within a certain function only; this allows much longer session times.


Click to configure the trigger.


Click to select one of the preset templates (built-in or user).

3 User Interface

3.1 Window Elements

The Analyzer window can show several aspects of the recording in different views, which can be toggled from the main toolbar.


3.1.1 Main Toolbar

The main toolbar provides access to Analyzer configuration and to toggle display of specialized views.

Analyzer Configuration

The Analyzer Configuration drop-down menu offers selection and manipulation of trigger configurations and a shortcut to the Analyzer Setup window. The Analyzer Configuration window combines trigger, profiler and coverage settings. The Analysis and Configuration block on the Hardware page offers selection of Profiler and Coverage analysis. If Manual Trigger is selected, the manual trigger settings are used; otherwise optimal settings are determined based on the analysis selection.

Real-time coverage selection enables online coverage mode, which is available on certain ActiveGT targets.

Reactivate session after CPU stop – analyzer recording is always stopped when the CPU stops, for example when a breakpoint is hit. With this option set, winIDEA reactivates the session before the CPU is set to run again. This scenario is typically used with coverage-type analysis; results are accumulated between reactivations.

The Hardware page also provides a list of Analyzer properties which can be set for a specific trigger configuration. These are:

Start – selects when the trigger should occur. Possible options are:

Trigger Immediately – the trigger occurs immediately when recording is started. Recording stops once the trace buffer is full.

On Trigger – the trigger settings are used. Recording stops once the trace buffer is full.

Continuous mode – sampling is active until the CPU is stopped. When the trace buffer is full, old records are overwritten with new ones.

Buffer Size – maximum size of the analyzer file. Either choose one from the drop-down menu or enter the preferred size manually. When recording is started, data is written to the user's temp folder; read the Host Performance chapter for information about moving the temp folder location. When the buffer is full (reaches its maximum allocated size), recording stops. This is the analyzer buffer and should not be confused with the on-chip trace buffer.

Note: Buffer Size specifies the maximum amount of data to be recorded. It will be less if the emulator trace buffer overflows; in continuous mode it will never be more than the emulator buffer size.

Trigger Position – selects the number of samples recorded before the trigger position.

Stall CPU – set to prevent on-chip trace module FIFO overflows.

Break On Trigger – stops the CPU on trigger.

Read the Profiler and Coverage chapters to learn more about the Profiler and Coverage tabs.

Begin New Session

A new live analyzer session is started. The acquired data is uploaded from the emulation hardware to the host PC and analyzed.

The live session will end when either of these situations occurs:

Emulator trace buffer is full

CPU stops

The maximum amount of acquired data is reached

Session is aborted with the Stop button

Re-analyze Last Session

Previously acquired analyzer data can be analyzed again.

This off-line analysis is started automatically when the analyzer document is opened.

The analysis can also be started manually to gain more information from an existing session, e.g. after adding or removing functions from the profiler configuration.

The off-line analysis can be stopped with the Stop button.


Stop

Stops the active session – live or off-line.

Note: in a live session, the first click of Stop stops data acquisition on the emulator; upload of the trace buffer and analysis of the loaded data continue. A second click stops these too.

View Trace

Toggles the Trace view display.

View Timeline

Toggles the Timeline view display.

View Statistics

Toggles the Statistics view display.

Auto Synchronize

If this button is pressed, the caret marker position is synchronized between the trace view and the timeline view: when the caret is moved in one view, the other view scrolls to the new position too.


Options

Opens the Options dialog.

General page offers these settings:

Trigger Status bar – toggle the display of the Watchdog and Duration tracker.

Auto save when saving workspace – check to automatically save the analyzer document when the workspace is saved.

Save symbol image – save the symbol image in the analyzer file.

Save source code – saves the source code in the analyzer file. This option increases the analyzer file size, but is useful when reviewing a recording where the source files are not available.

Save recording – save the recording to the analyzer file.

Colors page offers custom color selection for the analyzer window.

Trace page offers settings for displaying trace records. Only records matching the checked selections will be displayed in the trace display.

Profiler page offers these settings:

Auto Scale AUX – check to automatically scale analog AUX inputs between lowest and highest recorded value.

Display download file – check to display the download file together with the area name. Useful when areas with the same name are present in multiple download files.

Max number of states – the number of states allowed for a state variable is limited to prevent excessive memory consumption.

The Coverage page offers several settings for coverage display and HTML export.


Every document can have additional user information associated with it.


3.1.2 Status

When a session is activated, the analyzer starts and passes through several stages.

Waiting for trigger

In this state, the analyzer hardware is already recording data in a circular buffer, but none of this data is shown until a trigger event occurs. The status bar shows a green WAITING status.

Active recording

When trigger occurs, data recording continues until the hardware trace buffer is filled.

The Brown/Yellow rectangle indicates the hardware buffer state. The brown region shows recorded data, the yellow region the free trace buffer. Since the recorded data is uploaded on the fly, the trace buffer can possibly sustain recording the data indefinitely. The number on the right indicates the number of recorded samples.

The Blue/Gray rectangle indicates the allocated file state on the PC. The shown percentage is relative to the maximum allocated file buffer size, which is specified in the Recorder settings for each individual trigger:


When the trace buffer is filled (no more yellow color in the status), data recording is stopped. Data upload from the trace buffer to the PC file continues until the entire data is uploaded.

The maximum amount of recorded data depends on the trace data rate compared to the data upload rate; it therefore lies between the trace buffer size and the trace data file size limit.


Analysis starts as soon as the records are uploaded to the PC. The analysis progress is indicated in the analyzer status bar:

During the analysis the analyzer is still running, even though the recording may have been stopped (by hitting the Stop button or stopping the CPU). To abort the analysis, hit the Stop button a second time.


3.1.3 Markers

Three markers are available in the timeline and trace views:

Caret – positioned with left mouse click

Marker 1 – positioned with Ctrl + left mouse click

Marker 2 – positioned with Ctrl + right mouse click

The status bar shows the positions of the markers:


Marker 1

Marker 2

Distance from Marker 1 to Marker 2. The frequency, computed as the inverse of the time distance, is displayed in parentheses.
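The displayed frequency is simply the inverse of the marker distance; for example, markers placed 2 ms apart correspond to 500 Hz:

```python
def marker_frequency(t1, t2):
    """Frequency shown next to the marker distance: the inverse
    of the time distance between Marker 1 and Marker 2 (seconds)."""
    return 1.0 / abs(t2 - t1)
```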

Keyboard shortcuts

Ctrl + J – Go to Trigger position

Ctrl + 0 – Go to Caret

Ctrl + 1 – Go to Marker 1

Ctrl + 2 – Go to Marker 2

Ctrl + Alt + 1 – Set Marker 1 (at current caret position)

Ctrl + Alt + 2 – Set Marker 2 (at current caret position)

Trace/Timeline Synchronization

Trace view can be synchronized to the position of the caret in the Profiler timeline view.

To synchronize manually, hold down the Shift key when placing the caret pointer.

To synchronize automatically on every caret move, press the toolbar button.


3.2 Timeline

The Timeline view is toggled with the toolbar button.

Color coding




Block / Red – Function, State – the area is active (function executing in body, state is active)

Block / Light red – Function, State – the area is suspended (function called another function, a task was preempted)

Block / Blue – Function, Line, State – multiple activities within a single display pixel; zoom in to get more information

Block / Brown – Line – the line is active (executing in body)

Block / Light brown – Line – the line is suspended (called another function)

Block / Green – State – the state variable is in the neutral state (no task is executing)

Flow / Red + Green – Data (state type) – a data variable of state type shows state flow; red lines indicate active states, green the neutral state

Flow / Blue – Data (regular type), AUX – a data variable of regular type, a state variable without a neutral value, or an AUX input


3.2.1 Columns

The first column (Code/Data/AUX) displays the area name. Clicking the column header sorts the areas by name.

The Value column displays the value of the Data/AUX area at the caret position.

The History column displays event time flow. Clicking on the column header in the Code pane sorts the areas by call stack sequence at the caret position.

3.2.2 Context Selection

Context selection is available if context switches have been detected (typically in an OS application).

The context is selected by dropping down the toolbar button or via the Context entry in the window's context menu.

3.2.3 Event Display

Function, Data and AUX events display can be toggled using toolbar buttons.


3.2.4 Zoom

Time zoom can be set using:

toolbar buttons

via context menu

Ctrl + mouse wheel

via keyboard shortcuts:

Num + – Zoom In

Num - – Zoom Out

Num * – Zoom All

3.2.5 Result Filtering

Code items displayed in the timeline and statistics view can be filtered using wildcard expressions in the Filter bar.

To open the filter bar, click the icon in the toolbar or choose the Filter command from the context menu.

Filter expressions use a case-insensitive wildcard format.

Several expressions can be specified by delimiting them with space characters; e.g. to display only functions starting with OS or CAN, specify the filter: OS*  CAN*

Filtering is applied as long as the filter bar is open. When closed, all areas are displayed again.

Note: filter expressions are synchronized between the profiler timeline and statistics views.
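The wildcard matching described above can be sketched in a few lines of Python (a hypothetical post-processing helper, not part of winIDEA; it assumes fnmatch-style wildcard semantics):

```python
from fnmatch import fnmatchcase

def matches_filter(name, filter_text):
    """Return True if name matches any space-delimited wildcard expression.

    Matching is case-insensitive, as in the Filter bar.
    """
    patterns = filter_text.split()
    return any(fnmatchcase(name.lower(), p.lower()) for p in patterns)

# Example: show only functions starting with OS or CAN
print(matches_filter("OS_Schedule", "OS* CAN*"))   # True
print(matches_filter("main", "OS* CAN*"))          # False
```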

Hide items with no activity

Press the Filter bar’s button:

3.2.6 Event Tracking

Caret Marker can be moved to the next or previous event using these keyboard shortcuts:


Go to Next Event


Go to Previous Event

Only events in the selected context are tracked.


The Find dialog is opened with the Ctrl+F keyboard shortcut or via the find toolbar button.

Find functionality is enabled when an area is selected in the left column.

If an event is found, Marker 1 is positioned at the beginning of the event and Marker 2 at the end of the event. If the timeline toolbar’s button is checked, the located area will be scrolled into view.

The following searches are available:

Occurrence of an event – a write to a data area or entry into a function

Duration of an event (longer or shorter than specified value)

The specified time can use these units: ns, us, ms, s

Searches for function areas can use the function event’s Net, Gross or (in an OS application) Call time

Keyboard shortcuts


Open Find dialog


Find Next


Find Previous

3.3 Profiler Statistics

Statistics view is toggled with the toolbar button.

Any statistical aspect of the profiler session can be displayed as a column in the statistics view. Column selection is available via the button or by right-clicking the header line (below the toolbar).

To sort the contents by a specific statistical criterion, click the respective column header.

Color coding




Block / Red

Function, State

Default measure for the respective column.

Block / Light red

Function, State

Gross time in the mixed Net/Gross display.

Block / Brown


Default measure for the respective column.

Block / Light brown


Gross time in the mixed Net/Gross display.

3.3.1 Event Display

Function, Data and AUX events display can be toggled using toolbar buttons.

3.3.2 Area Information

Profiler session statistical information for an area is available by right-clicking on an area and selecting Properties.

The arrow buttons can be used to position the timeline view to the point where a specific event occurred.

This information is available for different areas and is specified in the Profiler Concepts chapter.

Profiler statistics for function f, measured in the OS task OS_Task_A


3.4 Trace

Trace view is toggled with the toolbar button.


3.4.1 Toolbar

	Trigger Status Bar – opens a hardware trigger pane. Applicable only on ICE architectures.

       Export – exports the trace

       Configure Signals – opens a dialog for Signal configuration

       Configure States and Filters – opens a dialog for State and Filter configuration


The Location box shows the function of the execution location at the caret position. The two buttons will move the caret, considering the recorded program flow, in the following fashion:

       GoTo First Function Sample

Caret is moved to the first op-code executed in the current function body – i.e. when execution entered the range, either as it was called or a called function returned to it.

If the caret is already located on the first sample in a function, then the caret is moved to the previous last sample in the current function – i.e. when the function exited or called another function.

       GoTo Last Function Sample

Caret is moved to the last op-code executed in the current function body – i.e. before the execution moves to a different function, either as a call or return.

If the caret is already located on the last sample in a function, then the caret is moved to the next first sample in the current function – i.e. after the called function returns or the function is called again.



3.4.2 Columns

This is the sample number, relative to the trigger position. With on-chip traces, where multiple actions of the CPU are packed into a single message, the number is encoded as Sample.Subsample:


Sample is the message count since the trigger.

Subsample is the index of the CPU activity within a single message. The picture, with on-chip-trace data display enabled, shows how a Nexus IBHM message is decoded into program flow.


This is the associated address of a trace item:

Op-code address

Address of data access

Address or part of address encoded in the trace message


This is the associated data of a trace item:


Value of a data access

Data or part of data encoded in the trace message

If only a part of data is valid (e.g. 8-bit data access on a 32-bit CPU), the ignored bits are displayed grayed.


This column displays the decoded trace item in readable form. The text depends on the trace item class.


Displays the time of the trace item as:

absolute (distance from trigger)

relative – the ‘duration’ of this item, which is defined as the time difference between this item and the next item of the same class.

3.4.3 Trace Items

Trace data is acquired at the lowest-level connection between the CPU and the emulator. The data obtained there is analyzed, and higher-level information is then available for display in the trace view.

On-Chip Trace Data

If an on-chip CPU trace is used, the acquired data is a stream of messages transmitted over the trace port. Since OCT messages are compressed, the information here is usually of diagnostic value only. The analyzer will decode these messages and generate a sequence of bus activities.

Bus activity

In an in-circuit emulation system, this is the lowest level of obtained data, whereas an OCT system decodes this data from the OCT messages.

Depending on the used CPU, different bus activities are available. Typically, Instruction is always available; on most CPUs also Write and Read.


AUX

Auxiliary signals (digital and analogue) attached in parallel to the CPU trace.


Disassembly

Disassembly information is derived from Instruction bus activity. The address flow and op-code data are disassembled.

Source Line

Source line information is derived from Instruction bus activity. The address is looked up in the symbol table and the source line text is obtained from the source code file itself.

If the source file is not found, only the name of the file and the line number are displayed.


Label

Label information is derived from Instruction, Write and Read bus activities. The address is looked up in the symbol table and if a matching symbol is found, its name is displayed.

The symbols will typically be:

Program functions and their exit points

Global code labels

Variables with global lifetime (global, file or function static)

Function Data

Function information is derived from Instruction bus activities, similar to Label information. The trace tries to match function entries and exits to generate a function call tree.

Selecting Items for display

To select display items, click the icon in the main toolbar and select the Trace pane.


3.4.4 Find

The Find dialog is opened with the Ctrl+F keyboard shortcut. The search operates as a string search in the Content column.

To search using a more complex expression, select the Logic signal expression and configure the search expression using available data and operators.


3.5 Coverage

Coverage view is toggled with the toolbar button.


3.5.1 Toolbar

	Accumulation – when pressed, the previous results are not cleared when starting a new session.

       Export – exports the coverage data

	Find – finds an item by name

       Filter – select items to show/hide. Manual selection allows individual item selection.

Columns – select columns to display.



3.5.2 Columns

This column displays the coverage items as they were resolved from the configuration.

On the root these two nodes are shown:

Images – expand to see all download file images included in the analysis

Ranges – expand to see all memory ranges included in the analysis

An Image expands to show these nodes:

Ranges – memory ranges which this image loads code to

Folders – if the Folders are shown in the Filter selection, then all folders found are displayed. These in turn will expand to show source file modules found in each of them.

Modules – source files included in the analysis

A Module expands to show its functions and source lines found outside function bodies.

A Function expands to show its source lines.

A source line will expand to show its memory range, or (if Assembler Information is enabled) its op-codes.

Statement (lines)

This column displays statement coverage information on source line level.

The first number shows the number of lines executed.

The second number shows the number of lines detected in the item.

Statement (object)

This column displays statement coverage information on the object code level.

The information is shown in MAUs (memory accessible units). This corresponds to the number of addresses covered by an item.

The first number shows the number of MAUs executed.

The second number shows the number of MAUs detected in the item.

Condition coverage outcomes

This column displays condition coverage information on object code level.

The information is shown as the number of conditional instructions.

A single conditional instruction can have two outcomes (true or false).

If during the session both outcomes were detected, the result is marked as both.

A range with multiple conditionals will show:

number of conditionals with only true outcome

number of conditionals with only false outcome

number of conditionals with both outcomes

The next number in the column specifies the number of conditional op-codes detected in the item. Percentage indicates the ratio between paths taken and paths possible. For example, if there is one conditional statement and one path was taken (e.g. true), the conditional statement will be 50% covered. If both, true and false, paths were taken, the conditional statement is 100% covered.
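The percentage rule above can be checked numerically with a small sketch (not winIDEA code, just the stated arithmetic):

```python
def condition_coverage_percent(only_true, only_false, both):
    """Percentage of outcome paths taken vs. paths possible.

    Each conditional op-code has two possible outcomes (true/false).
    """
    conditionals = only_true + only_false + both
    if conditionals == 0:
        return 100.0
    # Conditionals with a single detected outcome contribute 1 path,
    # conditionals with both outcomes contribute 2 paths.
    taken = only_true + only_false + 2 * both
    return 100.0 * taken / (2 * conditionals)

# One conditional, only the true path taken -> 50% covered
print(condition_coverage_percent(1, 0, 0))  # 50.0
# One conditional, both paths taken -> 100% covered
print(condition_coverage_percent(0, 0, 1))  # 100.0
```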

Execution counts






Number of executions of the op-code with the minimum and the op-code with the maximum number of executions



Number of functions executed/detected in the module


Number of executions of this function


Number of executions of this line



This column displays the number of call executions in an item.

The information is shown as the number of executed/detected call op-codes.


3.5.3 Navigation

To display the source code and disassembly code for an item, double-click the item.


4 Export

4.1 Trace

Export functionality is available once the recording has been processed.


Click on the Export button on the Trace toolbar (marked on the image below).

Choose desired export settings in the Export dialog:


Specifies the file path to which the data is exported.


Specifies the export format.

Click the Options… button to configure format-specific settings.

Launch Associated Viewer

If checked, the system associated viewer application is launched after export.

Time Scope

Defines the time range for timeline export.

Entire session: Exports all recorded events

Between markers: Exports only events between the two markers. If the markers are not set, this option is not available

Selected signals only: Exports only the columns that are selected in the Trace window.

Use Filtered Samples: Only samples that are currently listed are exported.

4.1.1 Text Format

Number of characters in Content Signal

Defines the width of the Content string.


Number                Address   Data      Content                         Time          


728525.0              0000210C  C13F2003  C13F se_lwz      r3, #04(r31)   518.552299 ms  


728525.1              0000210E  2003D13F  2003 se_addi     r3, #01        518.552317 ms  


4.1.2 CSV Format

Field separator

Specifies the character to use to separate fields.

Default: comma

Export column headers

If this option is checked, the names of the columns are exported as the first line.



"728525.0","0000210C","C13F2003","C13F se_lwz      r3, #04(r31)","518.552299 ms",


"728525.1","0000210E","2003D13F","2003 se_addi     r3, #01","518.552317 ms",


Note: if a single sample generates multiple entries in the trace window, only the first entry will have a sample Number and Time value.
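When post-processing a CSV export, this note matters: continuation entries of a multi-entry sample carry an empty Number and Time. A possible way to carry the values forward (the continuation row below is illustrative; the column order is taken from the example rows above):

```python
import csv
import io

# Sample data in the exported column order: Number, Address, Data, Content, Time.
# The second row stands in for a continuation entry with empty Number/Time.
export = io.StringIO(
    '"728525.0","0000210C","C13F2003","C13F se_lwz      r3, #04(r31)","518.552299 ms",\n'
    '"","0000210E","2003D13F","2003 se_addi     r3, #01","",\n'
)

rows = []
last_number = last_time = None
for row in csv.reader(export):
    number, address, data, content, time = row[:5]
    if number:  # first entry of a sample: remember Number and Time
        last_number, last_time = number, time
    rows.append((last_number, address, last_time))

print(rows[1])  # ('728525.0', '0000210E', '518.552299 ms')
```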

4.1.3 Binary Format

Binary format is best suited for large exports and easy parsing for further processing.

File Layout

File layout is determined by the information which is exported and is shown in the Options dialog dynamically.

If the File header is saved, the first byte will contain the file version:

[00]        File version

[01]        Sample size in bytes

File Header

If the File version is 2 or higher, 15 bytes of flags follow:

[02]        Number of CPU buses

[03..04]        Sample flags

 0x0001        Data present

 0x0002        Address present

 0x0004        Address Memory area present

 0x0008        Time stamp present

 0x0010        Bus Status present

 0x0020        Sample index present

 0x0040        On-Chip trace data present

 0x0080        Aux present

 0x0100        Sample flags present

 0x0200        Trace Source ID present

[05]        Number of MAUs

[06]        Number of bus status bytes (1 bus status byte per MAU)

[07..10]        Reserved
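A minimal header reader might look like this sketch. It assumes the multi-byte Sample flags field is little-endian (consistent with the LSB ordering used elsewhere in this format); offsets are hexadecimal:

```python
import struct

def read_file_header(buf):
    """Parse the binary export file header described above."""
    version, sample_size = buf[0x00], buf[0x01]
    header = {"version": version, "sample_size": sample_size}
    if version >= 2:
        header["cpu_buses"] = buf[0x02]
        # Sample flags, assumed little-endian (LSB order, as for samples)
        (flags,) = struct.unpack_from("<H", buf, 0x03)
        header["flags"] = {
            "data":         bool(flags & 0x0001),
            "address":      bool(flags & 0x0002),
            "mem_area":     bool(flags & 0x0004),
            "timestamp":    bool(flags & 0x0008),
            "bus_status":   bool(flags & 0x0010),
            "sample_idx":   bool(flags & 0x0020),
            "oct_data":     bool(flags & 0x0040),
            "aux":          bool(flags & 0x0080),
            "sample_flags": bool(flags & 0x0100),
            "source_id":    bool(flags & 0x0200),
        }
        header["maus"] = buf[0x05]
        header["bus_status_bytes"] = buf[0x06]
        # [07..10] reserved
    return header

# Hypothetical version-2 header: flags 0x000B = Data + Address + Time stamp
hdr = read_file_header(bytes([2, 0x24, 1, 0x0B, 0x00, 4, 4]) + bytes(10))
print(hdr["flags"]["data"], hdr["flags"]["timestamp"])  # True True
```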

Array of Samples

Every exported sample starts with the following layout. Each information field may be enabled or disabled through the Binary Export Options (as seen in the previous image).

[00..01] Sample flags

[02..09] Sample index

[0A..0B] Subsample index

[0C..13] Time in ns

[14]        Trace Source ID

Trace Sample:

[15..18]        Address

[19..1A]        Address memory area

[1B..1E]        Data

[1F..22]        Bus status flags

         0x01 Instruction

         0x02 Single Data (Instrumentation message without address)

         0x04 First MAU of instruction

         0x08 Data not valid

         0x10 Info

         0x20 Write

         0x40 Address not valid

         0x80 Single Data 1 (Instrumentation message with address)

On chip data sample:

[15]        Num items (N)

[16]        Item 0 bitsize

[17]        Item 0 value (len = bitsize)

... Up to item N-1

Aux sample:

[15]        Num items (N)

[16]        Item 0 bitsize

[17]        Item 0 flags

               0x80        Analog (otherwise digital)

               0x40        Output (otherwise input)

               0x3F        Analog channel number mask

[18]        Item 0 value (len = bitsize)

... Up to item N-1

Sample flags use the same encoding as in the Header, but for every sample only the sample-relevant fields are present.

On-chip data samples use a variable encoding due to their dynamic nature. The number and meaning of individual items depend on the trace protocol and usually on the leading item.

<Num Items>                = BYTE, number of message items

<Item 1 bit size>        = BYTE, size of item data in bits

<Item 1 data>                = minimum number of BYTEs required for bit size. LSB order


<Item N bit size>        = BYTE, size of item data in bits

<Item N data>                = minimum number of BYTEs required for bit size. LSB order

For bit-size 1-8 one BYTE data is used, for 9-16 two BYTEs, 17-24 three BYTEs etc.

AUX samples use the following encoding:

<Num Items>                = BYTE, number of message items

<Item 1 bit size>        = BYTE, size of item data in bits

<Item 1 flags>                = BYTE, description of the item

<Item 1 data>                = minimum number of BYTEs required for bit size. LSB order


<Item N bit size>        = BYTE, size of item data in bits

<Item N flags>                = BYTE, description of the item

<Item N data>                = minimum number of BYTEs required for bit size. LSB order

Note: Analog item data is 32-bit IEEE-754 float, also in little endian format.
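Both variable-length encodings follow the rules stated above: each item's data takes the minimum number of bytes for its bit size ((bits + 7) // 8), LSB first, and analog AUX values are little-endian 32-bit IEEE-754 floats. A decoding sketch for the AUX case:

```python
import struct

def read_aux_items(buf, pos=0):
    """Decode an AUX sample item list: count, then (bitsize, flags, data)."""
    count = buf[pos]; pos += 1
    items = []
    for _ in range(count):
        bits, flags = buf[pos], buf[pos + 1]
        pos += 2
        nbytes = (bits + 7) // 8          # minimum bytes for the bit size
        raw = buf[pos:pos + nbytes]
        pos += nbytes
        if flags & 0x80:                  # analog item: IEEE-754 float, LE
            value = struct.unpack("<f", raw)[0]
        else:                             # digital item: LSB-first integer
            value = int.from_bytes(raw, "little")
        items.append({"bits": bits, "flags": flags, "value": value})
    return items, pos

# Hypothetical sample: one digital input (1 bit, value 1)
# and one analog input (32 bits, value 1.5)
sample = bytes([2, 1, 0x00, 0x01, 32, 0x80]) + struct.pack("<f", 1.5)
items, _ = read_aux_items(sample)
print(items[0]["value"], items[1]["value"])  # 1 1.5
```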

4.1.4 XML Format

XML format is best suited for smaller exports and further processing, e.g. for printing or HTML transforms.

XML schema can be found in templates\TraceExport\TraceExport.xsd file in the winIDEA installation folder.

The XML property names used are displayed in the dialog’s information pane.






     <TIME>518.552299 ms</TIME>

     <DASM>se_lwz      r3, #04(r31)</DASM>






     <TIME>518.552317 ms</TIME>

     <DASM>se_addi     r3, #01</DASM>







4.2 Profiler

Launch Associated Viewer

If checked, the system associated viewer application is launched after export.


Specifies the file path to which the data is exported


Specifies the export format. Click the Options… button to configure format-specific settings.



Exports statistical information for every area:

Total time spent in the area

Number of activations

Percentage of the time spent in the area vs. total profiler session time

Minimum, maximum and average activation time (net, gross and call times).

Minimum, maximum and average time between consecutive activations (period).


Exports timeline information for every area:

Data and AUX areas: time and value written

Functions: time and state of the area (active + nesting level)

The time range for timeline export can be:

Entire session        Exports all recorded events

Between markers        Exports only events between the two markers.
If the markers are not set, this option is not available

Note: time range selection is available only for Text1 and XML formats. CSV and Text formats export all events.

Data Object Mappings

Exports a mapping table for data objects where such mapping exists (e.g. task IDs).

Note: this information is always present in Text1 and XML formats. It is configurable for Text and CSV formats.


Defines which area types will be exported.

Selection        Exports only the area currently selected in the profiler window.

All Areas        Exports all areas

Filter        Exports only areas which match the specified filter wildcard pattern.
This option is available only for the Text1 and XML  formats.

Note: function lines are exported only if functions are also exported.

Export only active areas

Only areas with recorded activity will be exported.

This option is available only for XML and Text1 export formats.


If checked, Data areas are exported.


If checked, AUX areas are exported.


If checked, function areas are exported.

If Function Lines is checked, the lines of functions which include lines are also exported.

The Context selection is available if the profiler session recorded context switches. The selection defines the context for which the Function statistics data is exported. To facilitate understanding of the differences depending on the Context selection, consider this example:


void main()
{
  StartTask(TaskA, entryTaskA);
  StartTask(TaskB, entryTaskB);
  while (1);
}

void entryTaskA()
{
  while (1)
  {
    f();
  }
}

void entryTaskB()
{
  while (1)
  {
    g();
    f();
  }
}

There are three contexts in this application:

Unknown – in which function main is executing.

TaskA – in which functions entryTaskA and f are executing.

TaskB – in which functions entryTaskB, g and f are executing.

Let’s assume 1000 executions of the loop in entryTaskA and 500 in entryTaskB:


Statistics results for every context are exported individually.






































Same as All, but only results for the context which is currently viewed in the profiler window are exported.


For statistics, results from all contexts are accumulated.

























Text Export Format Specification







Execution Profiler Results

Begin: <begin time>

End:   <end time>


*********** Information: Data ***************

{ // repeated for every data object

<data object name>              |  Mapping


{ // repeated for every value

<value name>                    |   <value>




<Data Statistics>

<Functions Statistics>

<Data Statistics>

*********** Statistics: Data ***************

{ // repeated for every data object

<data object name>  |Time|%|Min|Max|Average|Activations|Min Period|Max Period|Average Period|


{ // repeated for every value

<value name 1>      |<results>                                                              |


---------------------------------------------------------------------------------------------
  TOTAL:              |<results>                                                              |



<time>ns | <percentage> | <count> | <void>

<Functions Statistics>

*********** Statistics: Functions ***************

Functions           |Time|%|Min|Max|Average|Activations|Min Period|Max Period|Average Period|

---------------------------------------------------------------------------------------------
{ // repeated for every function

<function stat>


---------------------------------------------------------------------------------------------
  TOTAL:              |<results>                                                              |

<function stat>

{ // repeated for every context

[TASK: <context ID>]

<function name> |<results>                                                              |



<Data Timeline>

<Functions Timeline>

<Data Timeline>

*********** History: Data ***************

{ // repeated for every data object

<Data Object Timeline>


<Data Object Timeline>

<object name>


{ // repeated for every event

<event time>       | <value written>


<Functions Timeline>

********** History: Functions  **********

{ // repeated for every function

<function timeline>


<function timeline>

{ // repeated for every context

[TASK: <context ID>]

<function name> [


<event time> | <function state >


4.2.1 Text1 Export Format Specification

Text1 format is highly configurable. In the Export dialog select Options…

The format of every emitted entry can be configured in this dialog.

Data from an exported entry is accessible as a macro enclosed in % characters.


If this option is checked, the timeline is exported in a binary file with the same name as the main file, but with an additional .BIN extension. The binary format is much more compact than text export and is generated (and eventually parsed) much faster too.

The data layout in the binary file is, for every event entry:

EVENT: 4-BYTE, LSB order

DATA: 4-BYTE, LSB order

TIME: 8-BYTE, LSB order

Every entry thus consumes 16 bytes.

Note: DATA for analog values is stored in 32-bit IEEE754 float format.

EVENT

32-bit integer, OR combining the HANDLE and the event type. The event type is encoded in bits 24-27:

For Functions, Lines and State variables these event types are:

Event Type


entry into the area, e.g. when a function is called


suspend of the area, e.g. when an Entered function calls another function


resume of the area, e.g. when execution returns to the Entered function from the called function


exit from the area, e.g. when an Entered function returns


For Variables and AUX, the Event value is 0.

32-bit data associated with the event. Used with regular variables and AUX, 0 otherwise.

Signed 64-bit time in nanoseconds of the event occurrence.
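One 16-byte entry can be decoded as sketched below. The EVENT, DATA, TIME field order and little-endian packing are assumptions consistent with the layout and LSB ordering stated above; the handle value in the example is hypothetical:

```python
import struct

def read_timeline_entry(buf, offset=0):
    """Decode one 16-byte Text1 binary timeline entry."""
    event, data, time_ns = struct.unpack_from("<IIq", buf, offset)
    return {
        # The event type occupies bits 24-27; the rest is the area HANDLE
        "handle":     event & 0xF0FFFFFF,
        "event_type": (event >> 24) & 0x0F,
        "data":       data,       # 32-bit data (regular variables/AUX), else 0
        "time_ns":    time_ns,    # signed 64-bit time in nanoseconds
    }

# Hypothetical entry: handle 0x10000008, event type 1, data 5, t = 2500 ns
raw = struct.pack("<IIq", 0x10000008 | (1 << 24), 5, 2500)
e = read_timeline_entry(raw)
print(hex(e["handle"]), e["event_type"], e["data"], e["time_ns"])
```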


If this option is checked, then write events into state variables are reported with the state handle, otherwise with the variable handle.


Write 5 into variable VAR





TimelineStates option checked:



TimelineStates option not checked:




Macros are predefined placeholders for the actual data. In the format text, a macro must be enclosed in % characters.


the string format specified for an entry – this allows reporting the format that is used

Applicable: always


the class of the data entries listed. This can be




Function and Line class


Data and OS objects, regular and state


AUX inputs

Applicable: always


Name of the Context for which the items are exported. For areas other than functions, this macro is empty.

Applicable: StatisticsHeader, StatisticsEntry


a 32-bit hexadecimal value by which the entry/area is identified. Area type can be identified by the top nibble:









Variable state




AUX state


Applicable: MappingEntry, StatisticsEntry, TimelineEntry
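Since the area type is identified by the top nibble of the 32-bit HANDLE, an export post-processor can split the two parts. This is a sketch; the mapping from nibble values to area types follows the table above:

```python
def split_handle(handle):
    """Split a 32-bit area HANDLE into its type nibble and identifier part."""
    area_type = (handle >> 28) & 0xF   # top nibble identifies the area type
    ident = handle & 0x0FFFFFFF        # remaining 28 bits identify the area
    return area_type, ident

# Hypothetical handle values in the style of the export examples
print(split_handle(0x10000008))  # (1, 8)
print(split_handle(0x30000000))  # (3, 0)
```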


HANDLE of the parent area. Applicable for function lines and data states.


HANDLE used by states. Applicable for data variables of state type.


name of the area. Depending on area type these are:




Function name


Line string from source code


Variable name

Variable state

Variable state name (if any)


AUX name

AUX state

AUX state name

Applicable: MappingEntry, StatisticsEntry


NAME of the parent area. Applicable for function lines and data states.


the 32-bit hexadecimal value associated with the area. Depending on area type these are:




Line number

Variable state

Variable state value

AUX state

AUX state value

Applicable: MappingEntry, StatisticsEntry, TimelineEntry


the type of the area value. Depending on area type these are:



Analog AUX




Applicable: MappingEntry


the unit associated with the value. These are typically defined for analog AUX quantities only; e.g. “V”, “A”.

Applicable: MappingEntry


File path. Applicable for function lines.

Applicable: MappingEntry, StatisticsEntry


Number of recorded events.

Applicable: StatisticsEntry


Statistic time in nanoseconds for NET, GROSS, CALL and PERIOD categories and MIN, MAX and AVG criteria.




Total Net time


Minimum Net time


Maximum Net time


Average Net time


Total Gross time


Minimum Gross time


Maximum Gross time


Average Gross time


Total Call time


Minimum Call time


Maximum Call time


Average Call time


Minimum Period time


Maximum Period time


Average Period time


Total Outside time


Minimum Outside time


Maximum Outside time


Average Outside time

Applicable: StatisticsEntry


a character identifying the event type. For Functions, Lines and State variables these event types are:

Event Type


entry into the area, e.g. when a function is called


suspend of the area, e.g. when an Entered function calls another function


resume of the area, e.g. when execution returns to the Entered function from the called function


exit from the area, e.g. when an Entered function returns


The character used for regular variables and AUX is W.

Applicable: TimelineEntry


time of the event occurrence in nanoseconds.

Applicable: TimelineEntry

Exported Items

Mapping Section

The Mapping section provides links between profiled areas and the handles by which these areas are identified in later sections. Handles are 32-bit hexadecimal values.

Areas are listed in this order:

<Mapping Header> // CLASS = ‘Functions’

{ // repeated for every function

<function entry>

{ // repeated for every function line. Functions with profiled lines only

<line entry>



<Mapping Header> // CLASS = ‘Data’

{ // repeated for every variable

<variable entry>

{ // repeated for every variable state. state variables only

<state entry>



<Mapping Header> // CLASS = ‘AUX’

{ // repeated for every AUX

<AUX entry>

{ // repeated for every AUX state. state AUX only

<AUX state entry>















10000008,g_cStaticSignal = STATIC;,7










Statistics Section

The Statistics section provides statistical information for areas across the entire session.

Areas are listed in this order:

<Statistics Header> // CLASS = ‘Functions’

{ // repeated for every function

<function entry>

{ // repeated for every function line. Functions with profiled lines only

<line entry>



<Statistics Header> // CLASS = ‘Data’

{ // repeated for every variable

<variable entry>

{ // repeated for every variable state. state variables only

<state entry>



<Statistics Header> // CLASS = ‘AUX’

{ // repeated for every AUX

<AUX entry>

{ // repeated for every AUX state. state AUX only

<AUX state entry>














30000000,1,1,3491

For multiple contexts, function statistics are exported for every context separately. The %CONTEXT% macro is adjusted accordingly.


* STATISTICS(Functions) CONTEXT(TSK: extendedTaskFirst) %HANDLE%,%VALUE%,%COUNT%,%T.NET%




* STATISTICS(Functions) CONTEXT(TSK: extendedTaskSecond) %HANDLE%,%VALUE%,%COUNT%,%T.NET%




Timeline Section

The Timeline section provides a full history of recorded events in ascending time order.

{ // repeated for every event

<timeline entry>









4.2.2 XML Format

XML format is best suited for smaller exports and further processing, e.g. for printing or HTML transforms.

Mapping node exports all available properties of exported areas

Statistics node exports all valid statistical information

Timeline node exports all timeline information. Generating binary timeline is provided as an option.

XML schema can be found in templates\ProfilerExport\ProfilerExport.xsd file in the winIDEA installation folder.


Binary export is used for the timeline. This reduces the output file size by a factor of 10 and export times by a factor of 150.


An XML indent of 2 characters is used by default to increase readability. If manual review will not be used, disabling the indent will save 25% of the file size.


4.3 Coverage

Launch Associated Viewer

If checked, the system associated viewer application is launched after export.


Specifies the file path to which the data is exported


Specifies the export format. Click the Options… button to configure format-specific settings.


This option is enabled if for a certain export type several variants are available.






If modules (C source files) are to be exported, further options apply:

Lines – statistics for module lines are included

Source Code – source code text is exported for every source line

Filter – a wildcard definition of modules to export. All modules are exported if left empty.


If functions are to be exported, further options apply:

Lines – statistics for function lines are included

Filter – a wildcard definition of function to export. All functions are exported if left empty.


Enable export of assembly level coverage information.


Enable export of range (address) level coverage information.

4.3.1 XML Format specification

XML schema can be found in templates\CoverageExport\CoverageExport.xsd file in the winIDEA installation folder.

4.3.2 HTML Format specification

HTML format is realized in the CTC++ format. The specification can be found here:


5 Appendix

5.1 Trigger Templates

5.1.1 Creating a Template Trigger

Any regular trigger can be made into a template. Once a trigger configuration is operating correctly, click the Create Template button in the trigger configuration dialog.

The Template creation dialog will open. Fill in the name and description.

Notice that the Changed items list displays all parameters which differ from the default trigger configuration.

Next, move the items which you wish to be configurable in the template, from the Changed items list to the Parameters list:

Select the item

Click the Add button

Specify the parameter properties which will be understandable to the template user


Review the configuration. Items in the Parameters list will be available for configuration. Their current values will be used as defaults.

Click the OK button to finalize the template.

5.1.2 Editing an Existing Template Trigger

Existing user templates can be edited by invoking the Template Editor from the Trace configuration dialog.

5.1.3 Using a Template Trigger

A template trigger is created similarly to a regular trigger.

In the Analyzer configuration drop down menu, select Create new configuration.

Enter the name for the new trigger, select desired analysis and choose Template as the Hardware Trigger.

The Templates dialog shows all templates applicable for the current CPU configuration.

Two lists are provided:

1. User Templates – templates created by the user, located in the folder specified in Tools/Options/Environment/User Templates

2. Built-in Templates – templates already provided by iSYSTEM. These are located in the winIDEA installation folder's Templates subfolder.

Select the desired list

Select the desired template and click OK.

Specify the actual values for configurable parameters. For symbolic entries you can browse the symbol table by clicking the ... button to the right of the entry field.

This trigger will remain in template form and is discernible from regular triggers by the icon. Additionally, the name of the template used is displayed in square brackets after the trigger name.

Converting a Template Trigger to Normal Type

If additional fine-tuning of the trigger is required by accessing the lower-level configuration, a template trigger can be converted back to the normal Trace type.

To do so:

Right click the trigger in the analyzer configuration dialog

Select the Convert from template command

Confirm the prompt.

Now the trigger can be edited fully.

Disclaimer: iSYSTEM assumes no responsibility for any errors which may appear in this document, reserves the right to change devices or specifications detailed herein at any time without notice, and does not make any commitment to update the information herein.

© iSYSTEM. All rights reserved.