If you heard someone say "a word of memory", how big would you think it is?

when I procrastinate, nobody cares
but when a computer procrastinates, people call it "lazy evaluation" and start fawning over it
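for anyone who wants to watch the computer procrastinate at home - a minimal sketch of lazy evaluation in Rust, a hand-rolled thunk on top of std::cell::OnceCell (the Thunk name and everything else here is made up for illustration):

```rust
use std::cell::OnceCell;

// A value that procrastinates: the closure runs only when someone actually
// asks for the result, and the answer is cached so the work happens at most once.
struct Thunk<T, F: Fn() -> T> {
    cell: OnceCell<T>,
    compute: F,
}

impl<T, F: Fn() -> T> Thunk<T, F> {
    fn new(compute: F) -> Self {
        Thunk { cell: OnceCell::new(), compute }
    }

    // Forcing the thunk does the deferred work (once) and returns the result.
    fn force(&self) -> &T {
        self.cell.get_or_init(|| (self.compute)())
    }
}

fn main() {
    let answer = Thunk::new(|| {
        println!("ugh, fine, doing the work now");
        21 * 2
    });
    // Nothing has been computed yet - peak procrastination.
    assert_eq!(*answer.force(), 42); // the work finally happens here
    assert_eq!(*answer.force(), 42); // cached: the closure doesn't run again
}
```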

ohohohoho, I see topology memes in the timeline
but do tell me, if you know
when are we doing bottomology memes

pats are:

never back down 

always back up instead
back up everything

rust, spicy, you should read if you're a crab, educational 

old.reddit.com/r/rust/comments

today Rustaceans discovered mapped objects from Self (and JS), just without the inferencer, the inline caching, methods, the transition tree, clone, parent slots and probably more!

Some genius even seems to think they just discovered 'data oriented design', fucking- WHAT?? LMAO 😹

old.reddit.com/r/rust/comments

Kinda hoping someone comes along with 'slotette' with a 'metaslotprotocol' next but that might be asking too much 😹😹

#Rust #Rustlang #Lisp
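for the unfamiliar: a toy sketch of the Self-style slots-plus-parent model being (re)discovered above - value slots with lookup delegating to a parent slot, and none of the maps/inline caches/transition trees the post lists. All names here are invented for illustration, not any real crate's API:

```rust
use std::collections::HashMap;

// A Self-style object is "just" a bag of named slots plus a parent slot that
// lookups delegate to - prototype inheritance, no classes. This toy version
// stores plain integers; real Self also has method slots, maps, inline
// caches, and the rest of the machinery the post mentions.
struct SlotObj {
    slots: HashMap<String, i64>,
    parent: Option<Box<SlotObj>>,
}

impl SlotObj {
    fn new(parent: Option<SlotObj>) -> Self {
        SlotObj { slots: HashMap::new(), parent: parent.map(Box::new) }
    }

    fn set(&mut self, name: &str, value: i64) {
        self.slots.insert(name.to_string(), value);
    }

    // Look in our own slots first, then walk up the parent chain.
    fn get(&self, name: &str) -> Option<i64> {
        self.slots
            .get(name)
            .copied()
            .or_else(|| self.parent.as_ref().and_then(|p| p.get(name)))
    }
}

fn main() {
    let mut proto = SlotObj::new(None);
    proto.set("answer", 42);

    let mut child = SlotObj::new(Some(proto));
    child.set("own", 1);

    assert_eq!(child.get("own"), Some(1));
    assert_eq!(child.get("answer"), Some(42)); // delegated to the parent slot
    assert_eq!(child.get("nope"), None);
}
```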

lewd joke, type theory 

We have top types, bottom types and subtypes, when are we going to get domtypes?

(also where do switchtypes fit in)

Tired: CPU microarchitecture
Wired: Hardware-assisted bytecode interpreter

The hierarchy of PL implementations:
Interpret < Interpreter < Interpretest < Compil < Compiler < Compilest

re: arcane low-level crap 

The spec offers APIs for almost everything you could possibly want for driving hardware, but you can't take any of it with you to OS-land - and even before that, most of it isn't guaranteed to exist


arcane low-level crap, long- 

so it turns out that the UEFI spec has sections dedicated to things like:
- booting from HTTP [24.7]
- using Bluetooth devices [26] and connecting to Wi-Fi [27]
- (networking generally has 700 pages devoted to it, sections [24] through [31])
- Unicode collation [21.1] and regular expression support [21.2]
- managing user accounts? [36]

This is just a small sample; the other spec documents cover things like using a shell provided by UEFI itself - which looks (and acts) an awful lot like a souped-up DOS prompt! The more things change, eh?

And for all the features they crammed into the spec, once your OS gains control of the system you can use almost none of the UEFI APIs, and you still have to write drivers yourself.
But hey, at least you can get there without having to guess at how much memory you have or dealing with going from real mode to protected mode to long mode!

...I may have spent too long reading up on arcane low-level details like this. Not going to stop, though.

stupid UI joke 

The best argument against "modern" menu-centric interfaces is that it means you're spending your life behind (menu) bars

bad joke, again 

Why do legislators form general assemblies? It's 2020, they should be forming general Lisps instead

"PL" is short for "Pirate Librarian", you can't change my mind

re: PL rant, long 

the Scheme standard (R5RS at least) already gets away with the statement "All objects have unlimited extent, no object is ever destroyed" - and it heavily implies that the idealized semantics would extend this with "memory is infinite"

that applies to all languages with automatic memory management - that's basically the point, at least the "infinite memory" part

"Implementation limits" would essentially become part of the approximation ruleset, as would things like floating-point arithmetic (treating inexact approximations as ideal is just wrong)

I want to see a compiler that can optimize for accuracy, and it seems like this kind of split is the clearest way to accomplish that

If a language outright prohibits folding sqrt(2)*sqrt(2) into exactly 2, something is wrong - whether a given compiler is smart enough to actually do that in a given case is completely irrelevant

Compilers keep getting smarter, and I want to see that put in service of something other than mindless speed for a change
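concretely: under IEEE 754 doubles, sqrt(2)*sqrt(2) really doesn't come out to 2, which is exactly the fold an accuracy-oriented compiler working from the idealized semantics should be allowed to make. A quick Rust check (assuming f64 arithmetic):

```rust
fn main() {
    // IEEE 754 binary64: sqrt(2) rounds to 1.4142135623730951, and squaring
    // that rounds to 2.0000000000000004 - close, but not exactly 2.
    let x = 2.0_f64.sqrt() * 2.0_f64.sqrt();
    assert_ne!(x, 2.0);
    assert!((x - 2.0).abs() < 1e-15);

    // Under idealized semantics sqrt(2)*sqrt(2) *is* 2, so folding the
    // whole expression to exactly 2.0 would be an accuracy improvement.
}
```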


but maybe I'm stupid 

it's entirely possible that somebody else has either done this or proven it infeasible/impossible, in which case feel free to laugh at me


clueless PL theory rant, long 

okay so I've had this very stupid idea bouncing around: what if PLs had *two* semantic models: one would be an "idealized" model where e.g. memory is infinite, all numbers are exact, whatever unrealistic shit you want to say - it doesn't need to be implementable on even a Turing machine

and then you have an "approximation ruleset" that specifies exactly the tradeoffs that are made in order to cope with the limitations of physical machines - so if the idealized model says "all numbers are exact", the approximation ruleset tells you when and how inexact results are generated and how they behave, just for an example

and I find it's interesting to think about because compilers can already optimize code for many things, not just performance (debuggability for one)

and making this kind of split explicit seems like it would open the door for letting compilers optimize for accuracy, such as by e.g. calculating additional digits of pi during constant-folding to avoid roundoff errors
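a tiny illustration of the split, in Rust (this is just IEEE 754 binary64 behavior, not a real language feature): the idealized model says 1/10 + 2/10 = 3/10 exactly, and the approximation ruleset is what would explain the rounding you actually get:

```rust
fn main() {
    // Idealized model: 1/10 + 2/10 = 3/10, exactly.
    // Approximation ruleset (IEEE 754 binary64): none of 0.1, 0.2, 0.3 are
    // representable, so each literal rounds to the nearest double and the
    // sum lands on a slightly different double than the rounding of 3/10.
    let sum = 0.1_f64 + 0.2_f64;
    assert_ne!(sum, 0.3);
    assert!((sum - 0.3).abs() < 1e-15);
}
```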

re: valuing notification 

to anyone who ever sees this, consider yourself notified: you are valued, loved and important


PL pun 

How come typing notifications are a thing but valuing notifications are not?

another horrible pun 

If a Mastodon user with a sweet tooth gets a really sappy toot on their timeline, do they become a tootlicker?

types.pl

A Mastodon instance for users who like the study of programming languages, formal semantics, types, or linguistics.