Algebra, in this context, means a system of rules for deciding that two programs (or functions) that are expressed differently are actually equivalent, in the sense that they map equal inputs to equal outputs: f and g are equivalent if f(x) and g(x) return the same value for all x.
The algebra of arithmetic is the most familiar, and it lets us determine that f(x) = 1 + (1 + (1 + x)) is equivalent to g(x) = ((1 + 1) + 1) + x for any integer x. In this case, the relevant algebraic law is associativity.
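This equivalence is easy to check mechanically. A minimal sketch in Python (the function names f and g are just the ones from the text):

```python
# Two different expression trees for "3 + x".
def f(x):
    return 1 + (1 + (1 + x))

def g(x):
    return ((1 + 1) + 1) + x

# They agree on every input we try, as associativity guarantees.
assert all(f(x) == g(x) for x in range(-100, 100))
```

Of course, testing on a range of inputs only gives evidence; the algebraic law is what guarantees agreement on all inputs.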
One abstract way to think about integers is as sets of equivalent arithmetic trees, so "3" is a stand-in for the set {(1 + 1) + 1, 1 + (1 + 1)}.
Other kinds of objects that can appear in a program can have similar kinds of equivalences under various operations.
As a less familiar example, take axis-aligned bounding boxes. Translation and scaling each map an axis-aligned bounding box to another axis-aligned bounding box. Intersection maps two bounding boxes to a smaller bounding box, their overlap. "Convex closure" maps two bounding boxes to a larger bounding box: specifically, the smallest bounding box that encloses both of them.
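These four operations can be sketched concretely. Here a box is represented as a tuple (xmin, ymin, xmax, ymax); the representation and the function names are illustrative choices, not from the text:

```python
# An axis-aligned bounding box as (xmin, ymin, xmax, ymax).

def translate(box, dx, dy):
    x0, y0, x1, y1 = box
    return (x0 + dx, y0 + dy, x1 + dx, y1 + dy)

def scale(box, s):
    x0, y0, x1, y1 = box
    return (x0 * s, y0 * s, x1 * s, y1 * s)

def intersect(a, b):
    # The overlap of two axis-aligned boxes is again an axis-aligned box.
    return (max(a[0], b[0]), max(a[1], b[1]),
            min(a[2], b[2]), min(a[3], b[3]))

def hull(a, b):
    # "Convex closure": the smallest box enclosing both inputs.
    return (min(a[0], b[0]), min(a[1], b[1]),
            max(a[2], b[2]), max(a[3], b[3]))
```

Note that intersect can produce an "inverted" tuple when the boxes are disjoint; a fuller implementation would represent the empty box explicitly.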
There are then various algebraic laws that let us determine that differently expressed combinations of these operations are in fact equivalent.
If "∧" represents intersection, and A, B, and C are bounding boxes, then f(A, B, C) = (A ∧ B) ∧ C and g(A, B, C) = A ∧ (B ∧ C) are equivalent because intersection is associative.
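Associativity of intersection follows from the componentwise definition, since max and min are themselves associative. A quick randomized check of this law, using the tuple representation assumed above:

```python
import random

def intersect(a, b):
    # Componentwise max of the mins, min of the maxes.
    return (max(a[0], b[0]), max(a[1], b[1]),
            min(a[2], b[2]), min(a[3], b[3]))

def random_box():
    x0, y0 = random.uniform(-10, 10), random.uniform(-10, 10)
    return (x0, y0, x0 + random.uniform(0, 10), y0 + random.uniform(0, 10))

# (A ∧ B) ∧ C == A ∧ (B ∧ C) for every triple we generate.
for _ in range(1000):
    A, B, C = random_box(), random_box(), random_box()
    assert intersect(intersect(A, B), C) == intersect(A, intersect(B, C))
```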
Applying the same translation to two bounding boxes and then intersecting the results is equivalent to intersecting the boxes first and then translating the result: translation commutes with intersection.
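This commuting law can also be checked directly. A self-contained sketch using the same assumed tuple representation (integer coordinates keep the comparison exact):

```python
def translate(box, dx, dy):
    x0, y0, x1, y1 = box
    return (x0 + dx, y0 + dy, x1 + dx, y1 + dy)

def intersect(a, b):
    return (max(a[0], b[0]), max(a[1], b[1]),
            min(a[2], b[2]), min(a[3], b[3]))

A, B = (0, 0, 4, 4), (2, 2, 6, 6)

# Translate both boxes, then intersect...
lhs = intersect(translate(A, 3, 5), translate(B, 3, 5))
# ...versus intersect first, then translate the result.
rhs = translate(intersect(A, B), 3, 5)

assert lhs == rhs
```

For an optimizer, the second form is cheaper: one translation instead of two.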
There are also interesting equivalences that relate intersection and closure.
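One such equivalence, offered here as my own example rather than one from the text, is an absorption-style law: since the closure of A and B always encloses A, intersecting A with it gives back A. In the assumed tuple representation:

```python
def intersect(a, b):
    return (max(a[0], b[0]), max(a[1], b[1]),
            min(a[2], b[2]), min(a[3], b[3]))

def hull(a, b):
    # "Convex closure": smallest box enclosing both inputs.
    return (min(a[0], b[0]), min(a[1], b[1]),
            max(a[2], b[2]), max(a[3], b[3]))

# Absorption: A ∧ hull(A, B) == A, because hull(A, B) contains A.
A, B = (1, 1, 3, 3), (0, 2, 5, 4)
assert intersect(A, hull(A, B)) == A
```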
This kind of broadly construed "algebraic" manipulation can be used to rewrite one program into an equivalent program that executes more efficiently, and this is one way of thinking about what an optimizing compiler is doing when it optimizes a program.