Basically any transfer function used as an interpolator can also be used for easing, e.g. a quadratic Bézier (say, with empirically determined coefficients). A lesser-known one I used to like a lot is the superellipse, although it is very costly.
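To make the idea concrete, here is a minimal sketch (the function name and the exponent parameter are my own choices, not from the comment above) of an ease-out curve taken from one quadrant of a superellipse; `n = 2` gives the classic circular ease-out, and the two `pow` calls are what makes it comparatively costly:

```python
def superellipse_ease_out(t: float, n: float = 2.0) -> float:
    """Ease-out based on one quadrant of the superellipse |x|^n + |y|^n = 1.

    t is the normalized time in [0, 1]; n = 2 is the circular ease-out,
    larger n snaps harder toward the end value.  Costly compared to
    polynomial easings because of the two pow() calls.
    """
    return (1.0 - (1.0 - t) ** n) ** (1.0 / n)
```

At `t = 0` it returns 0, at `t = 1` it returns 1, and in between it rises steeply then flattens, which is the usual ease-out shape.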
A problem I see with talking exclusively about lexing is that when you separate lexing from parsing, you miss the point that a lexer is an iterator consumed by the parser.
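A toy sketch of that relationship (the grammar and names are mine, purely for illustration): the lexer is a generator, and the parser pulls tokens from it on demand rather than receiving a pre-built token list:

```python
import re
from typing import Iterator, NamedTuple

class Token(NamedTuple):
    kind: str
    text: str

def lex(src: str) -> Iterator[Token]:
    """The lexer is a lazy iterator: it yields one token at a time,
    only when the parser asks for the next one."""
    for m in re.finditer(r"(?P<NUM>\d+)|(?P<OP>\+)|(?P<WS>\s+)", src):
        if m.lastgroup != "WS":           # skip whitespace tokens
            yield Token(m.lastgroup, m.group())

def parse_sum(tokens: Iterator[Token]) -> int:
    """A toy parser for `sum := NUM ('+' NUM)*` that consumes
    the lexer iterator directly."""
    total = int(next(tokens).text)
    for tok in tokens:
        if tok.kind == "OP":
            total += int(next(tokens).text)
    return total
```

Calling `parse_sum(lex("1 + 2 + 3"))` drives the lexer and the parser in lockstep; no intermediate token array ever exists.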
Halstead complexity ([0]) might interest you. It's a semi-secret sauce, based on the counts of operators and operands, that tells you whether your function is "shitty".
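The core formulas are easy to sketch once you have the operator/operand lists (extracting those lists is language-specific and left out here; the function name and return shape are my own):

```python
import math

def halstead(operators: list[str], operands: list[str]) -> dict[str, float]:
    """Halstead metrics from flat lists of every operator and operand
    occurrence in a function."""
    n1, n2 = len(set(operators)), len(set(operands))  # distinct counts
    N1, N2 = len(operators), len(operands)            # total counts
    vocabulary = n1 + n2
    length = N1 + N2
    volume = length * math.log2(vocabulary)
    difficulty = (n1 / 2) * (N2 / n2)
    return {"volume": volume,
            "difficulty": difficulty,
            "effort": difficulty * volume}
```

For `a = b + c` the operators are `["=", "+"]` and the operands `["a", "b", "c"]`, giving a difficulty of 1.0 and a volume of 5·log₂5; big, branchy functions inflate these numbers quickly, which is the point.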
although when working with enumerations there is still a risk: reordering enumerators or adding new ones can break the switches.
Despite the drawback, I prefer it. Also, a range can be a formal expression, which simplifies the grammar of other sub-expressions and statements: not only switches but also array slices, tuple slices, foreach, literal bitsets, etc.
This is what the D programming language does. Every variable declaration has a well-known value, unless it is initialized with `void`. This is nice; optimizing compilers are able to drop the useless assignments anyway.
Compression reduces the range between the lowest and the highest levels, so your ears are subjected to a more or less "constant" pressure. Personally, when I was into home production, I only dared to eat the peaks with a limiter at reasonable settings, e.g. to gain one or two dB, never more.
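The gain structure described above can be sketched crudely (this is not a real limiter: a real one uses lookahead and attack/release envelopes instead of hard clipping; the function name and defaults are mine):

```python
def limit(samples: list[float], makeup_db: float = 2.0,
          ceiling: float = 1.0) -> list[float]:
    """Crude peak limiter sketch: apply a small make-up gain
    (one or two dB, as described above), then clamp whatever
    would exceed the ceiling."""
    gain = 10 ** (makeup_db / 20)  # +2 dB is roughly a 1.26x gain
    return [max(-ceiling, min(ceiling, s * gain)) for s in samples]
```

Quiet material comes out one or two dB hotter, and only the few samples that would clip get touched, which is the "eating the peaks" trade-off.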
On top of that, another problem with compression is that it is not neutral: bad compressors, especially in the digital domain, can introduce aliasing.
Also, search for the "loudness war". That is what we called the problem back in the mid-2000s.
Sure, CTFE can be used to generate strings that are later "mixed in" as source code, but it can also be used to execute normal functions, with the result stored in a compile-time constant (in D, that's the `enum` storage class), for example generating an array with a function literal called at compile time.
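Python has no CTFE, so there is no exact equivalent of D's `enum table = (){ ... }();`, but a loose analogue (my illustration, not D semantics: the work happens once at import time rather than at compile time) is building a constant table with an immediately invoked function literal:

```python
# Loose analogue of a D compile-time table: the lambda runs exactly
# once, at module import, and the result is bound to a constant name.
SQUARES: tuple[int, ...] = (lambda n=16: tuple(i * i for i in range(n)))()
```

In D the equivalent table would exist before the program ever runs; here it merely exists before any other module code uses it.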
> Make sure to initialize newly allocated memory before it is read.
The FreePascal compiler would warn about the missing initialization (first example of the "The Heap and Use After Free" section).
That being said, I think uninitialized locals are really something from another age. Modern languages should always default-initialize locals. Speed of execution is not an issue, as optimizing compilers are able to remove the dead assignments.
Delphi does default-initialize global variables and member variables of objects. It doesn't default-initialize local variables (but it should, for consistency).
Inline variable declaration with initialization makes it a bit nicer. You can write:
var myVar := 25;
to initialize myVar, and type inference makes it an Integer. You can specify the type if needed:

var myVar: Integer := 25;