1. Directory tree conforming to Gradle

Following the Gradle guidelines for the arrangement of files, the directory tree has the following structure:

+-build.gradle   ... build rules
+-.gitRepository ... contains the path to the .git directory
+-src
|  +-main      ... the sources of the module, without tests
|  |  +-cpp    ... C++ and C sources
|  |     +-src_emC
|  |        +-.git
|  |        +-.gitRepository
|  +-test      ... all for tests
|     +-cpp    ... C++ and C sources
|     |  +-emC_TestXYZ ... sources to test special topics
|     |  +-emC_TestAll
|     |     +-testmain.cpp  ... test organisation for the test of all
|     |
|     +-VS15   ... some test projects in Microsoft Visual Studio 15
|     +-EclCDT ... some test projects in Eclipse CDT
|     +-TI     ... tests on an embedded platform (Texas Instruments CCS)
|     +-QT     ... maybe test projects in Qt Creator or other tools
|     +-ZmakeGcc
|        +-All_Tests   ... test scripts in JZtxtcmd for the whole test with gcc
|           +-ZmakeGcc.jzTc.sh
+-docs       ... place for documentation
  +-asciidoc    ... in AsciiDoc

These are all the sources, which can be committed and compared with git. There are two git repositories present:

  • One for this whole tree, including all the test material, but not the module sources.

  • A second, located in Test_emC/src/main/cpp/src_emC: this is the source repository of emC itself, which can be included in user applications.

To get these repositories from GitHub, use

git clone https://github.com/JzHartmut/Test_emC.git

This gets the test sources only. But you can start on Windows:


or, on Linux, set executable rights and then start:


to clone the source tree src_emC, load some more tools, and compile and test.

2. Infrastructure on the PC for Test_emC

The tests can be run under Windows or Linux.

2.1. Windows arrangements

On Windows, MinGW or an equivalent should be installed to provide sh.exe for Unix shell scripts and to offer the GNU compiler suite (gcc).

Java should be available as JRE 8. The command line java should invoke JRE 8. If another Java version is installed as the standard, the system PATH can be changed temporarily, or all scripts should be invoked with a locally changed PATH environment setting.

On a Windows PC I have installed an ordinary git:

c:\Program Files\git
  <DIR>          bin
  <DIR>          cmd
  <DIR>          dev
  <DIR>          etc
  <DIR>          mingw64
  <DIR>          usr
         152.112 git-bash.exe
         151.600 git-cmd.exe
          18.765 LICENSE.txt
         160.771 ReleaseNotes.html

And MinGW for compilation:

<DIR>          bin
<DIR>          include
<DIR>          lib
<DIR>          libexec
<DIR>          mingw32
<DIR>          msys
<DIR>          share
<DIR>          var
<DIR>          _dll
<DIR>          _docu

The folder _dll contains

2016-12-11  23:44           115.214 libgcc_s_dw2-1.dll
2016-12-11  23:44         1.483.790 libstdc++-6.dll

which are copied from the c:\Programs\MinGW\bin\ directory. The path c:\Programs\MinGW\_dll is included in the system's PATH variable. This is necessary to execute *.exe files compiled with MinGW directly, because both DLLs are required at runtime. The other possibility would be to include c:\Programs\MinGW\bin\ in the PATH instead.

I have written a batch file named unix_script.bat which is associated with the extension .sh:

@echo off
set PATH=c:\Programs\MinGW\bin;c:\Programs\MinGW\msys\1.0\bin\; ...
   ... C:\Program Files\git\bin;%PATH%
set HOMEPATH=\vishia\HOME
REM the calling argument is the script path; convert backslashes to slashes for sh.exe:
set SCRIPTPATH=%1
set SCRIPTPATH=%SCRIPTPATH:\=/%
REM use sh.exe -x to output the commands as they are executed.
echo on
sh.exe -c %SCRIPTPATH%

Note that the two set PATH lines shown with ... are one line. With this batch file a shell script can be executed immediately with a double-click, including git commands and MinGW execution. The local PATH extension of the script includes the git and MinGW executables. The conversion of SCRIPTPATH replaces the backslashes (given in the calling argument on double-click) with the slashes necessary for sh.exe. The HOMEPATH and HOMEDRIVE variables set the home directory as it is known in Unix/Linux. So you can execute Unix/Linux shell scripts nearly as usual, as in the originals. Instead of copying the DLLs you can also include c:\Programs\MinGW\bin in the system's PATH, but in my opinion it is better to know exactly which DLLs are required.

2.2. Sense or nonsense of local PATH enhancements

You can enhance the PATH locally; that is the strategy of using -setEnv.bat inside the generation scripts for Windows. Note: on Unix/Linux an environment variable set in a called script is not seen by the caller, but for Windows batch files it works. This approach is well known to experts.

The other possibility is: during the installation process of a specific tool, the installer enhances the system settings. Then the tool runs without any scripting. This is the common way for ordinary installations.

Setting a special path into the PATH in a script has the advantage of more transparency. You see what is really necessary, and you can choose between different tools and versions which use the same command names (sh.exe, gcc.exe etc.).

3. Test strategies: individual and complete tests, documentation

The test of modules (units) has three aspects:

  • a) The nightly build test to assure that all is correct, avoiding bugs during improvement.

  • b) The manual step-by-step test to see what is done in detail, the typical developer test.

  • c) Tests document the usage.

Point a) is the primary one for continuous integration. Point b) is the most self-evident for the developer; one should use this test aspect first. Point c) is the most important for a user of the sources: one can see how it works by means of the test examples.

3.1. Individual Tests

There are some IDE project files:

  • src/test/VS15/All_Test/AllTest_emC_Base.sln: Visual studio

  • src/test/EclCDT/emC_Test/.cproject: Eclipse CDT

  • TODO maybe QT

Offering special test projects for various topics has not proven successful, because the maintenance of several projects is too high an effort. Instead, there is exactly one project per platform (that means two: one for Visual Studio and one for Eclipse CDT), no more. To test a special topic there is a main routine whose calling statements are commented out; only the interesting call is left active for single-step debugging. This is simple to do.

#ifdef DEF_MAIN_emC_TestAll_testSpecialMain
int main(int nArgs, char const*const* cmdArgs )

This is a snapshot of the current situation. This main routine is used for both IDEs.
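The pattern can be sketched as follows; the test routine names here are hypothetical, in Test_emC this role is played by the main routine shown above:

```cpp
#include <iostream>

// Hypothetical detail-test routines; in Test_emC these are the
// test routines of the emC_Test... sources.
static int test_TopicA() { std::cout << "Test: TopicA\n"; return 0; }
static int test_TopicB() { std::cout << "Test: TopicB\n"; return 0; }
static int test_TopicC() { std::cout << "Test: TopicC\n"; return 0; }

// Plays the role of the main routine: all calls are written down,
// but only the currently interesting one is left uncommented,
// for single-step debugging of exactly that topic.
int testSpecial() {
  int err = 0;
  //err += test_TopicA();
  err += test_TopicB();   // <-- the topic currently under test
  //err += test_TopicC();
  return err;
}
```

Commenting a call in or out is the whole configuration effort; the committed state simply reflects the last debugged topic.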

The include path is IDE- and configuration-specific. The two IDEs use different paths for the

#include <applstdef_emC.h>

These files should be changed for the several variants of emC compilation. Of course any commit contains the last used situation, which does not represent a developer's progress in every case.

The applstdef files are located in applstdef_Location_VStudio:

         1.651 AllTest_emC_Base.sln
<DIR>          applstdef_C1
<DIR>          applstdef_CppObj

This is for Visual Studio. The same set of files, but as separate files, exists for Eclipse CDT; see the project.

3.2. What is tested? C sources, compiler optimization effects, and the target hardware

Firstly, the algorithm of the C sources should be tested. It should be independent of the used compiler and its options. Hence any compiler can be used to test the sources, for example the Visual Studio compiler, gcc or others.

Secondly, it is possible that an algorithm works properly with the selected compiler, but fails in practice on the embedded hardware. What is faulty? It is possible that the target compiler optimizes more aggressively, and a property keyword such as volatile is missing in the source. It is a real failure in the source, but it was not detected by the test run with lower optimization.
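The volatile effect can be illustrated with a minimal, hypothetical sketch (dataReady, simulatedIsr and waitForData are invented names): a flag written outside the normal control flow must be declared volatile, otherwise a strongly optimizing compiler may cache it in a register and never re-read it inside the polling loop.

```cpp
// 'dataReady' stands for a flag written by an interrupt or another thread.
// Without 'volatile' the compiler may assume it never changes inside the
// loop and optimize the re-read away at high optimization levels.
static volatile bool dataReady = false;

static void simulatedIsr() { dataReady = true; }  // stands in for the interrupt

// Polls the flag; returns the number of polls until the flag was seen.
int waitForData(int maxPolls) {
  int polls = 0;
  while (!dataReady && polls < maxPolls) {
    ++polls;
    if (polls == 3) { simulatedIsr(); }  // here visible to the compiler;
  }                                      // a real ISR fires asynchronously
  return polls;
}
```

In the real case the interrupt fires asynchronously, so the compiler cannot see the write; that is exactly why a test with low optimization may not reveal the missing volatile.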

As a conclusion, the compiler and its optimization level should be chosen carefully. The test should be done with more than one compiler and with different optimization levels. For nightly tests the night may be long enough.

The next question is: "Test in the real target hardware?" An important answer is: "The test should not only be done in the special hardware environment; the sources should be tested in different environment situations." For example, an algorithm may work properly in the special hardware environment only because some data are shortened, while the algorithm is really faulty. Ergo, test it in different situations.

But the test in the real target environment, with the target compiler, running on the real hardware platform, may be the last of all tests. It can of course be done as an integration test, but the modules can be tested in this manner too.

This means the test should compile for the target platform, load the result into the target hardware, run it there, and get error messages, for example via a serial output, but run it as a module test. Because not all modules may fit into one binary (it would be too large), the build and test process should divide all modules into proper portions and test them one after another, or test in parallel on more than one hardware board.

4. Test of all: algorithm tests with gcc in some variants

Because the tests should run on a PC, the gcc compiler is favored for the common test_all. This common test is described in "Test environment for Test_emC from the git archive" as a how-to documentation.

4.1. Generating a script with compile commands

The compiler is invoked as a command in a script. The script contains the real compiler invocation literally. It is not a make script which builds the compiler invocation internally using dependencies, settings etc. The advantage of the literal compiler invocation is: it documents immediately what happens.

To generate this compiler invocation script, a JZtxtcmd script (see https://vishia.org/JZtxtcmd/html/JZtxtcmd.html) is used:

             207 +buildLoop.bat
             355 +cleanbuild.bat
           1.548 applstdef_emC.h
          17.656 ZmakeGcc.jzTc.sh  <<===

This script is similar to a make script; it contains the information about what to make. But the script defines a text translation, not a make run. The output of the translation is a set of shell scripts which invoke compiling, linking and executing for different test conditions. These output files are written to

  <DIR>          result
  <DIR>          dbgBhClassJcFull
  <DIR>          dbgBheap
  <DIR>          dbgBhSimple
          52.640 make_dbgBhClassJcFull.sh <<===
          50.299 make_dbgBheap.sh         <<===
          53.040 make_dbgBhSimple.sh      <<===

It is a snapshot with three test files. To produce them, the ZmakeGcc.jzTc.sh starts with the following statements:

#REM: invoked either from root of Test_emC or from current dir,
#REM but should work from point root of Test_emC
if test -f ZmakeGcc.jzTc.sh; then cd ../../../..; fi
java -jar libs/vishiaBase.jar src/test/ZmakeGcc/All_Test/ZmakeGcc.jzTc.sh
##Execute the just generated sh scripts, compile and execute:
exit 0  ##the rest of the file is the JZtxtcmd script

These are shell script statements, which invoke JZtxtcmd as the main class of vishiaBase.jar with this file. After that the generated file makeAll.sh is executed. It looks like:


It invokes all of the build files. The detailed build files look like this (shortened):

# call of compile, link and execute for Test emC_Base with gcc
if ! test -d build/result; then mkdir build/result; fi
rm -f build/dbgBhSimple/gcc*.txt
echo dbgBhSimple: Compile with -D DEF_ObjectJc_SIMPLE -D .....
echo ==== g++ emC/Base/Assert_emC.c 1>> build/dbgBhRefl/gcc_err.txt
if ! test -e build/dbgBhRefl/emC/Base/Assert_emC.o; then
  mkdir -p build/dbgBhRefl/emC/Base
  g++ -O0 -Wall -c -Wa,-adhln -D DEF_ObjectJc_REFLREF ....
  if test ! -e build/dbgBhRefl/emC/Base/Assert_emC.o; then
    echo c++ ERROR: emC/Base/Assert_emC.c
    echo ERROR: emC/Base/Assert_emC.c >> gcc_nocc.txt;
    echo c++ ok: emC/Base/Assert_emC.c
  echo exist: emC/Base/Assert_emC.c
echo ==== execute the test ====
build/dbgBhSimple/emCBase_.test.exe 1> build/result/dbgBhSimple.out
echo ==== Test cases ==========
cat build/result/dbgBhSimple.out
echo ==== Test failures =======
cat build/result/dbgBhSimple.err
echo ==========================

The compile command line is shortened here; see the original script.

With echo and cat a proper console output is produced while the test runs. The result can be checked and compared with the previous or reference result in ref via the produced files. Compiler and linker errors are written to files, so the problems can be detected, of course with the help of the IDE, which can be configured for the test variant.

4.2. Content of the ZmakeGcc.jzTc.sh for test cases

The ZmakeGcc.jzTc.sh continues with:

Openfile makeAll = "build/makeAll.sh"; ##global access for all build_...
main() {
  call test_emC();
##Compilation, Link and Test routine called also from the gradle task.
sub test_emC() {
 ##This routine calls all variants of compiling
 call build_DbgBheap(dbgOut="dbgBhSimpleNch", cc_def=cc_defSimpleNch);
 call build_DbgBheap(dbgOut="dbgBhReflNch", cc_def=cc_defReflNch);
 call build_DbgBheap(dbgOut="dbgBhSimple", cc_def=cc_defSimple);

It names and invokes the generation for some variants. The cc_def variable is, for example:

String cc_defSimpleNch = "-D DEF_ObjectJc_SIMPLE -D ...";

There is a variable for each variant. It contains compiler arguments, especially the definitions for the variant.

The subroutine defines which files are used:

##Compilation, Link and Test routine called also from the gradle task.
sub build_DbgBheap(String dbgOut, String cc_def) {

After <+makesh> there follow text generation parts.

<+out>Generates a file build/make_... <.+n>
Obj checkDeps = new org.vishia.checkDeps_C.CheckDependencyFile(console, 1);
checkDeps.readCfgData("src/test/ZmakeGcc/All_Test/cfgCheckDeps.cfg", File: <:><&currdir><.>);
<+out><:n>checkDeps_C: build/<&dbgOut>/deps.txt read successfully<.+n>
String sMake = <:><&currdir>/build/make_<&dbgOut>.sh<.>;
Openfile makesh = sMake;
<+makesh># call of compile, link and exe...<:n><.+>

The following zmake calls perform the compilation and linking. They name the used files in named filesets:

zmake <:>build/<&dbgOut>/*.o<.> := cppCompile( &c_src_emC_core
, &c_src, &src_Base_emC_BlockHeap
, &src_Base_emC_NumericSimple, &src_OSALgcc
, &srcTest_ObjectJc
, &srcTest_Exception
, &srcTestStmEv
, &srcTestBlockHeap
,cc_def = cc_def, makesh = makesh
zmake <:>build/<&dbgOut>/*.o<.> := cppCompile(&srcTestMain_All
,cc_def = <:><&cc_def> -D DEF_TESTALL_emC <.>, makesh = makesh
//This is the comprehensive test project.
zmake <:>build/<&dbgOut>/emCBase_.test.exe<.> := ccLink(&c_src_emC_core
, &c_src, &src_Base_emC_BlockHeap
, &src_Base_emC_NumericSimple, &src_OSALgcc
, &srcTest_ObjectJc
, &srcTest_Exception
, &srcTestStmEv
, &srcTestBlockHeap
, &srcTestMain_All
, makesh = makesh);

The cppCompile and ccLink are subroutines for text generation too, generating the compiler and linker calls.

The file sets are defined as:

Fileset c_src_emC_core =
( src/main/cpp/src_emC:emC/Base/Assert_emC.c
, src/main/cpp/src_emC:emC/Base/MemC_emC.c
, src/main/cpp/src_emC:emC/Base/StringBase_emC.c
, src/main/cpp/src_emC:emC/Base/Object_emC.c
, src/main/cpp/src_emC:emC/Base/ObjectJcpp_emC.cpp
, src/main/cpp/src_emC:emC/Base/Exception_emC.c
, src/main/cpp/src_emC:emC/Base/ExceptionCpp_emC.cpp
, src/main/cpp/src_emC:emC_srcApplSpec/applConv/ThreadContextUserBuffer_emC.c
, src/main/cpp/src_emC:emC_srcApplSpec/applConv/ExceptionPrintStacktrace_emC.c
, src/main/cpp/src_emC:emC/Test/testAssert_C.c
, src/main/cpp/src_emC:emC/Test/testAssert.cpp

The file sets are tailored for blocks of dependencies.

You can write your own filesets and checks with dependencies. It is simple to comment out some lines. The linking tells whether all was found.

The include path could be built from a fileset too, but is given here simply as:

String inclPath =  ##from position of the generated make.cmd file
<:>-Isrc/test/ZmakeGcc/All_Test <: >
-Isrc/main/cpp/src_emC/emC_inclComplSpec/cc_Gcc <: >
-Isrc/test/cpp <: >

You can experiment with the JZtxtcmd generation by yourself, also using another compiler, conveniently with a copied script.

4.3. Distinction of several variants of compilation

The distinction between C and C++ compilation can be done using either gcc for *.c files or g++, which always compiles as C++. This is the content of the special build_... routine. Some more build_... routines exist for different used files and for the decision between C and C++ compilation.

The distinction between variants of conditional compilation (see ../Base/Variants_emC.html) is done with the different content of the cc_def variable. It contains '-D ...' arguments for the compilation. The other variant approach is selecting different <applstdef_emC.h> files, which is recommended for user applications. Then the include path has to be varied; it needs several applstdef_emC.h files. This can be done too: the part of the include path to <applstdef_emC.h> is contained in the cc_def variable.
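The principle of variant selection via cc_def can be sketched with a hypothetical define (DEF_VARIANT_SIMPLE is an invented name here, not an emC macro):

```cpp
// Compiled with e.g. cc_def = "-D DEF_VARIANT_SIMPLE" the simple
// implementation is selected; without that define, the full one.
#ifdef DEF_VARIANT_SIMPLE
const char* variantName() { return "simple"; }
#else
const char* variantName() { return "full"; }
#endif
```

Each generated make_*.sh passes a different set of such -D arguments, so the same sources are compiled and tested once per variant.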

4.4. Check dependency and rebuild capability

A file should be compiled:

  • If the object file does not exist

  • If the source file is newer than the object file

  • If any of the included source files (e.g. header) is newer than the object file

The first two conditions are checked by an ordinary make tool. For the third condition (indirect renewal) the dependencies between the files have to be known. For classic make files these dependencies can be given, if they are known. In practice the dependencies depend on the include situation; it is not simple. Hence the real dependencies can only be detected for a concrete version of the file, and the make script would have to be corrected all the time. IDEs use their native dependency check.

Because this cannot be done easily, often there is a 'build all' mentality.

For repeated compilation the 'build all' mentality needs too much time.

For this approach a Java class org.vishia.checkDeps_C.CheckDependencyFile, together with some more files in that package, is used. This tool uses a comprehensive file deps.txt which contains the dependency situation of each file. The tool checks the time stamp of all depending files from the list. If one file is newer, its content is parsed to find the include statements and to rebuild the dependencies from this level. On the one hand the object should of course be recompiled, because the content may have changed. On the other hand the dependencies for the later tests are corrected.

Because the dependency file contains the time stamp of each source file, it is also detected whether an older file is given. The comparison of time stamps is not the comparison between source and object; it is the comparison between the last used source and the current source time stamp. The compilation is also done if the file is older, not only newer, than the object file. This is an expectable situation if a file is changed by checkout from a file repository with its original time stamp (the older one). Because git and some other Unix/Linux tools store an older file with the current timestamp, this problem is not present on Linux; but Windows restores or preserves the time stamp of a copied file, which may be the better approach and is the one supported here.
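The decision described above can be sketched as a small function (the names are invented; CheckDependencyFile implements this in a more elaborate way): a source is recompiled if the object is missing, or if the current source time stamp differs, in either direction, from the time stamp stored in deps.txt.

```cpp
#include <ctime>

// Sketch of the checkDeps timestamp rule: inequality, not "newer than".
// lastUsedTime is the source time stamp stored in deps.txt at the last build.
bool needsRecompile(bool objExists, std::time_t lastUsedTime,
                    std::time_t currentSrcTime) {
  if (!objExists) return true;            // object file does not exist
  return currentSrcTime != lastUsedTime;  // changed, also if restored with an OLDER stamp
}
```

CheckDependencyFile additionally walks the included files recursively; the sketch shows only the timestamp rule.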

If the dependency file does not exist, the dependencies have to be detected: a build all is necessary and the dependency file is built. This is the situation on the first invocation after clean.

The dependency file is stored inside the object directory:

 <DIR>          emC
 <DIR>          emC_srcApplSpec
 <DIR>          emC_srcOSALspec
 <DIR>          emC_TestAll
 <DIR>          emC_Test_C_Cpp
        202.969 emCBase_.test.exe
         14.488 gcc_err.txt
              0 ld_err.txt
              0 ld_out.txt
        220.557 deps.txt

It is a snapshot from the root of the object directory tree. The deps.txt has about 220 kByte; it is not too long. You can view this file to explore the individual dependencies of each file, which may be informative.

The dependency check is part of each build subroutine for one exe:

sub build_DbgBheap(String dbgOut, String cc_def) {
 <+out>Generates a file build/make_test_emC.sh for compilation and start test ... <.+n>
Obj checkDeps = new org.vishia.checkDeps_C.CheckDependencyFile(console, 1);
checkDeps.readCfgData("src/test/ZmakeGcc/All_Test/cfgCheckDeps.cfg", File: <:><&currdir><.>);
<+out><:n>checkDeps_C: build/<&dbgOut>/deps.txt read successfully<.+n>

The subroutine creates a checkDeps instance, which is initialized with the given dependencies (the file may not be found).

In any compilation invocation the dependency of the source file is checked:

sub cppCompile ( Obj target:org.vishia.cmd.ZmakeTarget, String cc_def...
 for(c_src1: target.allInputFilesExpanded()) {
   ##The checkDeps algorithm itself may be ...
   ##but it creates the obj directory tree which is necessary for compilation.
   ##The checkDeps checks whether the file is changed, delete the obj file
   Obj infoDeps = checkDeps.processSrcfile(File: &c_src1.file(),

The check of the unchanged situation only needs reading of the time stamps of all depending files; it is very fast because the file system is usually cached. If dependencies have to be evaluated newly, all source files are parsed. Of course, already parsed included files are not processed twice. The parsing and checking for #include statements needs only a short time, because Java is fast. The gcc compiler itself supports a dependency check too, but that is much slower (not because C++ is slow, but because it may be more complex). The checkDeps dependency check is simpler; for example it does not regard conditional compilation (a conditional include). This means it detects a dependency to an included file which is not active in the compiling situation. But that is not a disadvantage, because the dependency can exist, and an unnecessary compilation because of one conditional include does not need more time than an elaborate dependency check.
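The parsing for include statements can be sketched as follows; this is a deliberate simplification (an invented helper, not the checkDeps_C code), and, as described above, conditional compilation is ignored:

```cpp
#include <string>
#include <vector>
#include <sstream>

// Extracts the file names of #include <...> and #include "..." lines.
// #if/#ifdef conditions around the includes are deliberately ignored.
std::vector<std::string> findIncludes(const std::string& sourceText) {
  std::vector<std::string> includes;
  std::istringstream in(sourceText);
  std::string line;
  while (std::getline(in, line)) {
    std::size_t pos = line.find("#include");
    if (pos == std::string::npos) continue;
    std::size_t start = line.find_first_of("<\"", pos);
    if (start == std::string::npos) continue;
    char closer = (line[start] == '<') ? '>' : '"';
    std::size_t end = line.find(closer, start + 1);
    if (end == std::string::npos) continue;
    includes.push_back(line.substr(start + 1, end - start - 1));
  }
  return includes;
}
```

Each found name is then resolved against the configured include path and checked recursively.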

If the object file should be recompiled, the checkDeps algorithm deletes it and forces a recompilation via the existence check of the object file before compilation. It is a simple liaison between these independent tools.

4.5. Invocation variants of the ZmakeGcc.jzTc.sh

This script is called in build.bat or build.sh from the root of this working tree, as well as from a gradle script. The environment settings, especially the 'build' sub directory, should exist and be clean for a new test. The cleaning can be done either by manually deleting the 'build' directory (it should be a symbolic link, respectively a junction on Windows), or by invocation of clean.bat or clean.sh.

Calling +cleanbuild.bat invokes clean.bat and build.bat; hence it executes a new build with the 'clean all' strategy.

Calling +buildLoop.bat assumes a +cleanbuild.bat before and executes the ZmakeGcc.jzTc.sh in a loop after a pause. It is for error correction if any file does not compile or a source file has changed. The source editing can be done in an IDE, maybe with particular tests. Repeating the ZmakeGcc.jzTc.sh is a simple operation to repeat the test over all.

A simple invocation of ZmakeGcc.jzTc.sh does the same, because the start script used for the shell script sets the necessary system PATH to the MinGW compilation tools.

4.6. View of test results

The execution of the compiled build/test_case/*.exe writes its result to a file build/result/test_case.out. Check its timestamp and compare it with the stored reference result in ref/test_case.out.

The sources use the approach of the chapter "Test check and output in the test files". Hence they write:

Test: Name of the test (testfile @line)
  ok: Description of detail test
  ERROR: Description of detail test (testfile @line)

for each test routine. If an ERROR: was written, then refer to the line and repeat the test using single-step debugging in the IDE with the given variant settings (adjust <applstdef_emC.h>).

Additionally an output text can be written, for example testing the exception handling:

Test: test_Exception: (emC_Test_Stacktrc_Exc/TestException.cpp @ 95) ...
 ok: TRY without THROW with FINALLY is ok
 ok: File hint found in Exception
 ok: Exceptiontext: faulty index:10 for value 2.000000(10, 0) in: src/test/cpp/emC_Test_Stacktrc_Exc/TestException.cp4
Exceptiontext: faulty index:10 for value 2.000000(10, 0) in: src/te....
IndexOutOfBoundsException: faulty index:10 for value 2.000000: 10=0x0000000A
 at THROW (src/test/cpp/emC_Test_Stacktrc_Exc/TestException.cpp:41)
 at testThrow (src/test/cpp/emC_Test_Stacktrc_Exc/TestException.cpp:34)
 at test_Exception (src/test/cpp/emC_Test_Stacktrc_Exc/TestException.cpp:118)
 at main (src/test/cpp/emC_TestAll/testmain.cpp:75)
 ok: simple THROW is catched.
 ok: TRY without THROW after an Exception before has not entered CATCH BLOCK

In this case the programmed console output of the exception message and stack trace is shown. The distinction between test outputs and programmed outputs is made by Test:, ` ok:` and ` ERROR:` at the start of a line, see the examples above.
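The distinction by line prefix can be sketched as a small classifier (a hypothetical helper for evaluating the *.out files, not part of Test_emC):

```cpp
#include <cstring>

// Classification of a test output line by its prefix, as described above.
enum LineKind { kTest, kOk, kError, kProgramOutput };

LineKind classifyLine(const char* line) {
  if (std::strncmp(line, "Test:", 5) == 0)   return kTest;
  if (std::strncmp(line, " ok:", 4) == 0)    return kOk;
  if (std::strncmp(line, " ERROR:", 7) == 0) return kError;
  return kProgramOutput;  // any other line is programmed output
}
```

A checking script could count the kError lines to decide whether a nightly run has failed.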

5. Test check and output in the test files

The tests should work silently in nightly tests if they do not fail. It should be possible to output some information, one line per test, about what is tested.

Test results are checked with macros

EXPECT_TRUE(condition) << "additional test information";

etc.; the same macros as used for Google Test are used, but the whole Google Test framework itself is not used here. The EXPECT...-macros are defined in the following way:

#define EXPECT_TRUE(VAL) \
if(EXPECT_TRUEmsg1(VAL, __FILE__, __LINE__)) std::cerr

The routine EXPECT_TRUEmsg1(...) returns false if the condition is true, i.e. if no message should be output. Hence the if(...) construct with the following statement, starting with std::cerr and completed with << "additional text" in the user's code, forces the output only on error.

Only if the test fails, the file and line are reported, followed by the user message. With this information the failing test can be found simply by the developer.

The application of this macro is a simple writing style.
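A minimal sketch of this mechanism, simplified relative to testAssert.h (the errorCount and demo names are invented for illustration):

```cpp
#include <iostream>

static int errorCount = 0;  // counts failed expectations, for illustration

// Returns false if the condition holds (no output wanted); on failure it
// prints the file/line prefix and returns true to enable the << chain.
bool EXPECT_TRUEmsg1(bool val, const char* file, int line) {
  if (val) return false;
  ++errorCount;
  std::cerr << " ERROR: (" << file << " @" << line << ") ";
  return true;
}

#define EXPECT_TRUE(VAL) \
  if (EXPECT_TRUEmsg1((VAL), __FILE__, __LINE__)) std::cerr

// usage:
void demo() {
  EXPECT_TRUE(1 + 1 == 2) << "arithmetic broken\n";  // silent, test passes
  EXPECT_TRUE(false) << "this message is printed\n"; // reports file and line
}
```

The trailing std::cerr of the macro is exactly what makes the user's << "additional test information" chain work only in the error case.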

The test macros and operations are defined in emC/Test/testAssert.h with testAssert_C.c and testAssert.cpp in the emC_Base component, usable in all emC sources outside of tests too.

6. Test environment, mock, dependency injection

(additional content with common meaning, TODO)

The test routines itself calls one or some routines from the module sources in an environment arranged in the respective test routine. If instances are necessary, they are created and removed after test in the test routine. If additional depending complex modules are necessary, they should be replaces by mock objects because elsewhere the other module is tested too in a complex non-independent kind. The mock object should be simple and can contain some helper for checking the test behavior. The possible usage of dependency injection instead instantiating of composite objects inside the test object is a problem of the module source, not a problem of the test itself.