These topics provide a formal definition of C++ lexical elements. They describe the different categories of word-like units (tokens) recognized by the language.
The tokens in a C++ source file are derived from a series of operations performed on your programs by the compiler and its built-in preprocessor.
The preprocessor first scans the program text for special preprocessor directives (see Preprocessor directives for details). For example, the directive #include <inc_file> adds (or includes) the contents of the file inc_file to the program before the compilation phase. The preprocessor also expands any macros found in the program and include files.
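For illustration, here is a minimal sketch of both operations; the macro name BUFFER_SIZE and the program itself are hypothetical examples, not part of any library:

    #include <iostream>        // the preprocessor replaces this line with the contents of <iostream>

    #define BUFFER_SIZE 128    // object-like macro; every later use is replaced with 128

    int main()
    {
        char buffer[BUFFER_SIZE];              // expanded by the preprocessor to: char buffer[128];
        std::cout << sizeof(buffer) << '\n';   // prints 128
        return 0;
    }

After the preprocessing phase, the compiler proper never sees the #include or #define lines; it sees only the expanded text.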
A C++ program starts as a sequence of ASCII characters representing the source code, created using a suitable text editor (such as the IDE's editor). The basic program unit in C++ is a source file, together with all of the header files and other source files included with the #include preprocessor directive. Source files are usually designated by a ".c" or ".cpp" extension, while header files are usually designated by ".h" or ".hpp".
In the tokenizing phase of compilation, the source code file is parsed (that is, broken down) into tokens and whitespace.
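For example, a declaration such as the following (the identifier names are hypothetical) is broken down into individual tokens, with whitespace serving only to separate them:

    int total = count + 10;

The tokenizer recognizes seven tokens here: the keyword int, the identifiers total and count, the operators = and +, the integer constant 10, and the punctuator ;.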