This, and in general these DSLs, kind of makes me think of Kolmogorov complexity. We basically want the user to write the shortest possible program to describe the same algorithm, and let the compiler take care of the rest.
kayvonf
@jkorn. The goal is not necessarily to be short. Rather, I think a better goal is to reflect the "natural" way to think about a problem or algorithm. When a program reflects the natural way to think about a problem, the code is easier to understand, debug, etc...
Although it's common to cite lines of code as a measure of programming simplicity, it's important to remember that far more code in the world is read and debugged than written. So the optimize-for-the-common-case principle says a well-designed language should make algorithms as natural to read and understand as possible, not as easy to write as possible.
The code on this slide certainly fails that test! It's very difficult to understand what the code is doing from the text of the source.
jkorn
Right, that makes sense. Still, I feel like more often than not, because you don't have to worry about specific implementation details, DSL code ends up being much more succinct than writing the entire program yourself in, say, C.
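As a small illustration of that succinctness point (a hypothetical sketch, using NumPy array slicing as a stand-in for a DSL front end, not the actual language on the slide): the same 3-point blur can be written C-style with explicit loops and index bookkeeping, or as a single array expression that lets the library/compiler handle the rest.

```python
import numpy as np

def blur_loops(x):
    """3-point box blur written C-style: explicit loop, explicit bounds."""
    out = np.empty(len(x) - 2)
    for i in range(1, len(x) - 1):
        out[i - 1] = (x[i - 1] + x[i] + x[i + 1]) / 3.0
    return out

def blur_vector(x):
    """Same blur as one DSL-like array expression over shifted views."""
    return (x[:-2] + x[1:-1] + x[2:]) / 3.0

x = np.arange(10, dtype=float)
assert np.allclose(blur_loops(x), blur_vector(x))
```

Both versions compute the same result; the second states *what* is computed and leaves iteration order, bounds, and (in a real DSL) parallelization to the implementation.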
paramecinm
Sometimes letting the compiler do most of the work can make debugging harder. Many bugs are caused not by a wrong algorithm but by an incorrect understanding of how the compiler will interpret the code. Misunderstanding the compiler's behavior can also make performance analysis harder.
pdp
The compiler can also be pessimistic about some optimizations that would have been possible had the programmer written the code in low-level primitives. Hence, good DSL design is important.
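One classic source of that pessimism is potential aliasing: a C compiler must assume two pointers may refer to overlapping storage, so it cannot freely vectorize or reorder loads and stores. This hypothetical sketch (in Python, simulating what overlapping buffers do in C) shows why that assumption is forced: when source and destination share storage, each store changes a value a later iteration reads.

```python
def scale_overlapping(buf):
    # dst is buf[0:n-1], src is buf[1:n] -- the same storage, shifted.
    # Each write to buf[i] alters the value read on the next iteration,
    # so iteration order matters. Since a C compiler can't rule out
    # calls like this, it must stay conservative unless the programmer
    # (or a DSL that owns the data layout) promises no aliasing.
    for i in range(len(buf) - 1):
        buf[i] = buf[i + 1] * 2
    return buf

print(scale_overlapping([1, 2, 3, 4, 5]))  # → [4, 6, 8, 10, 5]
```

A DSL that controls memory allocation knows its buffers don't alias, so it can apply exactly the optimizations a C compiler has to forgo, which is part of why DSL design matters.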