Sure, though this might be my age showing in my use of old terms.
Quoting wikipedia: "The microSD removable miniaturized Secure Digital flash memory cards were originally named T-Flash or TF, abbreviations of TransFlash. TransFlash and microSD cards are functionally identical allowing either to operate in devices made for the other."
I feel that these languages represent an era of language design that we're leaving behind. I think the experience in industry is that things like ad-hoc type coercions, which are sold as simple and intuitive, are anything but. Gary Bernhardt's "Wat" talk on JavaScript is a great example.
Yeah, I've never understood how people can find dynamic typing "simple and easy to use". Types are wonderful for catching errors and documenting the code, and they make refactoring much easier. I even avoid auto/var/type inference as much as possible because I like seeing what types things are.
Rather than avoid type inference in C++ by removing auto, I typically specify a type on the RHS. I mostly do this because it isn't that unusual for the inferred type to be something I don't want but that still kind of works; Eigen's matrix evaluator types are an example.
I haven't bumped into this with Rust, but I don't see any reason it couldn't happen there; it may just be luck or the culture. I don't specify types in Rust unless cargo check tells me to, and VS Code with rust-analyzer happily annotates all the types if I want it to.
So far there seems to have been a pretty consistent ebb and flow. That approach is deeply out of favor right now but I wouldn't put money on it being permanently gone.
Does anyone know which, if any, of the microlanguages (Wren, Janet, Gravity, Nelua, etc.) have seen larger deployments? Lua has a large number of use cases, and it feels like it is tough to dethrone.
You don't have to do it this way, but it looks like the code to be interpreted is normally compiled to bytecode stored in a constant in a C header. So it's all in flash anyway, and you'd update the flash the same way as changing C code.
If you're going to do that, why not generate C and skip the interpreter? Is it because bytecode is more compact than native code?
I guess there would still need to be a runtime to handle array memory garbage collection.
I think this is just because a loader isn't included and that's the minimal example. In a real project you'd write your own code to load the compiled bytecode from somewhere else (external storage, a fetch from a server, or whatever).
I.e., here's an interpreter you can bolt onto your firmware codebase: it has no dependencies and makes no assumptions.
"Structs are functions that preserve the stack frame" is a cute idea/perspective. It doesn't look like this view leads to anything distinguishable from the usual approach with constructors, though. Any examples that might point out things you can do that you can't do with the usual struct+constructor?