In *Hacker's Delight*, Henry Warren gives a brief proof of a theorem he originally published in 1977, back when Communications of the ACM actually included such things:

A function mapping words to words can be implemented with word-parallel ADD, SUBTRACT, AND, OR, and NOT instructions if and only if each bit of the result depends only on the bits at and to the right of the corresponding position in each input operand.

For a single N-bit input there are 2^(2^(N+1)-2) such functions, far fewer than the (2^N)^(2^N) arbitrary N-bit-to-N-bit functions.
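The count can be sanity-checked by brute force for small N: output bit i may depend only on input bits 0..i, so flipping any higher input bit must leave it unchanged. A quick sketch of my own (not from the book), enumerating all 2-bit functions:

```python
from itertools import product

N = 2
count = 0
# Every function from N bits to N bits is a lookup table of 2^N entries.
for table in product(range(2**N), repeat=2**N):
    ok = True
    for i in range(N):
        # Bits strictly to the left of position i.
        mask_hi = ((2**N - 1) >> (i + 1)) << (i + 1)
        for x in range(2**N):
            for flip in range(1, 2**N):
                # Flipping only higher bits must not change output bit i.
                if flip & mask_hi == flip:
                    if (table[x] >> i) & 1 != (table[x ^ flip] >> i) & 1:
                        ok = False
    count += ok

print(count)  # 64, which equals 2^(2^(N+1)-2) for N = 2
```

For N = 2 this finds 64 qualifying functions out of 256, matching the formula: output bit 1 may be any function of both input bits (16 choices), while output bit 0 must be a function of input bit 0 alone (4 choices).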

As an example of a function that cannot be so represented, consider "shift right 3 bits": each output bit depends on the input bit three positions to its left.
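By contrast, a classic function that does satisfy the condition is isolating the rightmost 1-bit, built from nothing but SUBTRACT (negation) and AND. A minimal sketch, assuming 32-bit words (the helper name is mine):

```python
WIDTH = 32
MASK = (1 << WIDTH) - 1  # simulate fixed-width two's-complement words

def rightmost_one(x: int) -> int:
    """Isolate the lowest set bit: x AND (-x)."""
    return x & (-x & MASK)  # two's-complement negate, then AND

print(bin(rightmost_one(0b101100)))  # -> 0b100
```

Each output bit here depends only on input bits at and to the right of its own position, exactly as the theorem requires.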

Challenge: can you specify a word-to-word function whose definition satisfies the condition above, but whose implementation (on some real-world architecture) is more efficient using an instruction that does *not* satisfy the right-dependency condition?

For example, "shift left by N bits" (for constant N) can be performed by repeated addition, but most architectures provide a native left-shift instruction. However, each bit of a left-shift result depends only on bits to its right, so left-shift itself doesn't meet the challenge.
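The repeated-addition construction is just doubling: each ADD shifts every bit one position left. A sketch for a constant shift of 3, again assuming 32-bit words:

```python
WIDTH = 32
MASK = (1 << WIDTH) - 1  # keep results within the word width

def shl3(x: int) -> int:
    """Shift left by 3 using only ADD: each addition doubles x."""
    for _ in range(3):
        x = (x + x) & MASK  # one ADD = one doubling, with wraparound
    return x

print(shl3(5))  # -> 40, i.e. 5 << 3
```

Three ADDs versus one shift instruction: the shift wins on most machines, but both compute the same rightward-dependent function.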
