clueless PL theory rant, long 

okay so I've had this very stupid idea bouncing around: what if PLs had *two* semantic models? one would be an "idealized" model where e.g. memory is infinite, all numbers are exact, whatever unrealistic shit you want to say - it doesn't need to be implementable even on a Turing machine

and then you have an "approximation ruleset" that specifies exactly the tradeoffs made to cope with the limitations of physical machines - so if the idealized model says "all numbers are exact", the approximation ruleset tells you when and how inexact results are generated and how they behave, to give one example
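to make that concrete, here's a hypothetical toy sketch (all names made up, nothing here is a real PL): one expression tree, evaluated under two models - exact rationals standing in for the idealized semantics, IEEE 754 doubles standing in for one possible approximation ruleset ("round after every operation")

```python
from fractions import Fraction
import operator

# Hypothetical sketch: one AST, two semantic models.
# The "number" argument picks the model: Fraction gives the idealized
# semantics (exact rationals); float gives one approximation ruleset
# (IEEE 754 doubles, round after every operation).

OPS = {"+": operator.add, "*": operator.mul, "/": operator.truediv}

def eval_expr(expr, number):
    """expr is a nested tuple ("op", lhs, rhs) or a literal string."""
    if isinstance(expr, tuple):
        op, lhs, rhs = expr
        return OPS[op](eval_expr(lhs, number), eval_expr(rhs, number))
    return number(expr)  # literals are parsed under the chosen model

e = ("+", ("/", "1", "10"), ("/", "2", "10"))  # 1/10 + 2/10

ideal = eval_expr(e, Fraction)   # exactly Fraction(3, 10)
approx = eval_expr(e, float)     # 0.30000000000000004
```

same program, two meanings - and the gap between `ideal` and `approx` is exactly what the approximation ruleset would have to document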

and I find it's interesting to think about because compilers can already optimize code for many things besides performance (debuggability, for one)

and making this kind of split explicit seems like it would open the door to letting compilers optimize for accuracy, e.g. by computing additional digits of pi during constant folding to avoid roundoff errors
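a sketch of what that fold buys you, using Python's `decimal` to stand in for a compiler's extended-precision constant folder: carry ~30 digits of pi through the fold and round to double once at the end, instead of rounding pi to a double first and then multiplying (two roundings)

```python
import math
from decimal import Decimal, getcontext

# What a naive constant folder does: round pi to a double first,
# then multiply - two roundings, so errors can compound.
naive = 355 * math.pi

# Accuracy-aware folding: carry ~30 digits of pi through the fold,
# round to a double exactly once at the end.
getcontext().prec = 30
PI_30 = Decimal("3.14159265358979323846264338327950288")
folded = float(355 * PI_30)

# High-precision reference, for measuring error.
true_355pi = 355 * PI_30

def err(x):
    # Decimal(float) converts the binary value exactly.
    return abs(Decimal(x) - true_355pi)
```

the single late rounding makes `folded` the correctly-rounded double of 355*pi, so its error can never exceed `naive`'s - that's the sense in which a compiler could "optimize for accuracy"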


but maybe I'm stupid 

it's entirely possible that somebody else has either done this or proven it infeasible/impossible, in which case feel free to laugh at me

re: PL rant, long 

the Scheme standard (R5RS at least) already gets away with the statement "All objects have unlimited extent, no object is ever destroyed" - and it heavily implies that the idealized semantics would extend this with "memory is infinite"

that applies to every language with automatic memory management - the "infinite memory" part is basically the whole point of GC

"Implementation limits" would essentially become part of the approximation ruleset, as would things like floating-point arithmetic (treating inexact approximations as ideal is just wrong)

I want to see a compiler that can optimize for accuracy, and it seems like this kind of split is the clearest way to accomplish that

If a language outright prohibits folding sqrt(2)*sqrt(2) into exactly 2, something is wrong - whether a given compiler is smart enough to actually do that fold in a given case is completely irrelevant
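concretely (Python sketch): evaluated naively under doubles, sqrt(2)*sqrt(2) misses 2 by one ulp, because sqrt(2) is already rounded and the multiply rounds again - so a fold that emits 2.0 is *more* faithful to the idealized semantics than the "honest" evaluation, not less

```python
import math

# Naive evaluation under IEEE 754 doubles: sqrt(2) is already a
# rounded value, and squaring it rounds a second time.
naive = math.sqrt(2) * math.sqrt(2)  # 2.0000000000000004

# What an accuracy-optimizing fold is allowed to emit, because the
# idealized semantics says sqrt(2)**2 is *exactly* 2:
folded = 2.0

assert naive != folded
```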

Compilers keep getting smarter, and I want to see that put in service of something other than mindless speed for a change

re: PL rant, long 

@nonparticular I can see why you wanted me to bump the character limit now
