• 0 Posts
  • 12 Comments
Joined 1 year ago
Cake day: July 5th, 2023

  • Object pooling is an absolute necessity for performance in modern environments that remove all manual memory management in favour of automatic garbage collection. And it's still good practice to reuse memory when you do have some manual control.

    Not many things will slow your program down (or make a garbage collector blow up) as effectively as alternately freeing and requesting tiny chunks of memory from the OS thousands of times a second.
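    To make that concrete, here's a minimal free-list pool in C. The names and the fixed capacity are my own illustration, not from any particular engine: the point is that after one up-front allocation, "allocating" and "freeing" an object become a couple of pointer swaps, and the OS is never touched again.

```c
#include <stddef.h>

#define POOL_CAPACITY 256

// Example pooled object; the intrusive next_free link lets free slots
// double as nodes of a singly linked free list.
typedef struct Particle {
    float x, y, vx, vy;
    struct Particle *next_free;
} Particle;

static Particle pool[POOL_CAPACITY];
static Particle *free_list = NULL;

// Thread every slot onto the free list once, at startup.
void pool_init(void)
{
    free_list = NULL;
    for (int i = POOL_CAPACITY - 1; i >= 0; i--) {
        pool[i].next_free = free_list;
        free_list = &pool[i];
    }
}

// O(1) "allocation": pop a slot off the free list; NULL if exhausted.
Particle *pool_acquire(void)
{
    if (!free_list)
        return NULL;
    Particle *p = free_list;
    free_list = p->next_free;
    return p;
}

// O(1) "free": push the slot back; memory is never returned to the OS.
void pool_release(Particle *p)
{
    p->next_free = free_list;
    free_list = p;
}
```

    The same shape works in GC'd languages, too: keep a list of dead objects and reset-and-reuse them instead of letting the collector chew through thousands of short-lived allocations.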




  • I love low-level stuff and this still took me a little while to break down, so I'd like to share some notes on the author's code snippet that might help someone else.

    The function morse_decode is meant to be called iteratively by another routine, once per morse "character" c (dot, dash, or null) in a stream, while feeding its own output back into it as state. As long as the function returns a negative value, that value represents the next state of the machine, and the morse stream hasn't yet been resolved into an output symbol. When the return value is positive, that represents the decoded letter, and the next call to morse_decode should use a state of 0. If the return value is 0, something has gone wrong with the decoding.

    state is just a negated index into the array t, which is actually two arrays squeezed into one. The first 64 bytes are a binary heap of bytes in the format nnnnnnlr, each corresponding to one node in the morse code trie. l and r are single bits that indicate whether the current node has a left or right child (i.e. whether reading a dot or dash in the current state leads to another valid state). nnnnnn is a 6-bit value that, when shifted appropriately and added to 63, becomes an index into the second part of the array, which is a list of UTF-8/ASCII codes for letters and numbers for the final output.
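    To make the interface concrete, here's my own much-simplified sketch of the same API in C, not the author's actual snippet: the trie covers only the one- and two-symbol letters (E, T, I, A, N, M) and stores each node's letter directly instead of packing nnnnnnlr bytes, but the heap indexing and the state/return-value convention match the description above.

```c
// The trie is a binary heap in an array: index 0 is the root, a dot
// moves to child 2*i + 1, a dash to child 2*i + 2. Unlike the packed
// 64-byte table described above, each node here stores its letter
// directly (0 marks a node with no letter).
static const char letters[7] = { 0, 'E', 'T', 'I', 'A', 'N', 'M' };

// state: 0 to start, or the negative value returned by the last call.
// c: '.', '-', or 0 (end of the morse character).
// Returns: negative = next state, positive = decoded letter, 0 = error.
int morse_decode(int state, char c)
{
    int i = -state;                      /* recover the heap index */
    if (c == 0)                          /* end of character: emit letter */
        return letters[i] ? letters[i] : 0;
    if (c != '.' && c != '-')
        return 0;
    int child = (c == '.') ? 2 * i + 1 : 2 * i + 2;
    return (child < 7) ? -child : 0;     /* descend, or fall off the trie */
}
```

    Decoding ".-" then looks like: morse_decode(0, '.') returns -1, morse_decode(-1, '-') returns -4, and morse_decode(-4, 0) returns 'A'.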



  • I completely agree. And the video didn't discuss how any of that actually happens, except to say that they send the update over radio, and to give a brief description of how the storage system on Voyager works (physically, not logically). That's what I meant by "really nothing here", "here" meaning "in the video", not "in how the Voyager probe works and updates are carried out".

    That next line, "It turns out they update the software by sending the update by radio," was meant to be a bit sarcastic, but I know that isn't obvious in text, so I've added a signifier.


  • This is a short, interesting video, but there's really nothing here for any competent programmer, even a fresh graduate. It turns out they update the software by sending the update by radio (/s). The video hardly goes any deeper than that, and also makes a couple of very minor layman-level flubs.

    There is a preservation effort for the old NASA computing hardware from the missions in the 50s and 60s, and you can find videos about it on YouTube. They go into much more detail without requiring much prior knowledge about specific technologies from the period. Here's one I watched recently about the ROM and RAM used in some Apollo missions: https://youtu.be/hckwxq8rnr0?si=EKiLO-ZpQnJa-TQn

    One thing that struck me about the video was how the writers expressed surprise that it was still working and also so adaptable. And my thought was, "Well, yeah, it was designed by people who knew what they were doing, with a good budget, led by managers whose goal was to make excellent equipment, rather than maximize short-term profits."


  • Some of the things you mentioned seem to belong more properly in the development environment (e.g. code editor), and there are plenty of those that offer all kinds of customization and extensibility. Some other things are kind of core to the language, and you'd really be better off switching languages than trying to shoehorn something in where it doesn't fit.

    As for the rest, GCC (and most C/C++ compilers) generates intermediate files at each of the steps that you mentioned, and you can have it perform those steps one at a time (preprocess, compile, assemble, link). So, if you wanted to perform some extra processing at any point, you could write your own program that works on those intermediate files, and automate the whole thing with a makefile.

    You could be on to something here, but few people seem to take advantage of the possibilities that already exist. Combine that with the fact that most newer languages/compilers deliberately remove these intermediate steps, and it suggests to me that whatever problems this situation causes may already have other solutions.

    I don't know much about them myself, but have you read about the LLVM toolchain or compiler-compilers like yacc? If you haven't, it might answer some questions.



  • The real problem, as I think we are all aware, is that copyright lasts for far too long. It should be a carrot used to support and encourage creators, not a stick used by publishers to beat both thieves and well-meaning people indiscriminately in a futile pursuit of unending revenue streams.

    And here are some comments on specific points in the article:

    "ESA and its member companies [...] support efforts by cultural institutions to build physical video game collections."

    "It simply is not accurate that the industry has opposed efforts by libraries to have legal access to games for preservation purposes."

    That's a carefully-worded statement. Yes, it would be fruitless and openly greedy to try to oppose things that libraries have an obvious legal right to do, like buy and preserve physical items. But that doesn't preclude publishers from lobbying lawmakers and presenting testimony to reduce what's legal in the first place, such as using tools and processes needed to keep the games actually playable.

    "The ESA says it recognises the importance of libraries, with Mgbojikwe observing that more than 2,500 video games have been donated to the Library of Congress to date."

    While the Library of Congress' so-called Mandatory Deposit doesn't apply to video games (AFAIK), I would wager that most if not all of these games were "donated" to fulfil the mandatory requirement which is a part of the process of formal copyright registration in the USA. While this formal registration itself isn't mandatory, the donations still probably weren't given out of the goodness of their hearts.

    "Games like Sid Meier's Covert Action feature ideas and mechanics not seen in modern games, yet remain unavailable."

    Covert Action has been available on GOG for very nearly ten years at the time of writing. I get the point that they're trying to make, but if you're going to use a specific example, pick one that stands up to at least casual scrutiny.


  • I once had problems getting an OpenGL program running under Windows on a 2nd gen i5 laptop, and discovered that the issue was that Intel had actually removed support for anything later than OpenGL 1.1 from the later versions of their drivers for those iGPUs.

    After a little research, it looks like there's a "legacy" version of Mesa (mesa-amber) recommended for iGPUs on anything older than 3rd gen, and also that even the main Mesa drivers don't support Vulkan fully on 3rd and 4th gen Core iX processor iGPUs.

    I think that you may just be out of luck trying to use Vulkan on a 2nd gen Core iX CPU iGPU, but I'd love to hear from someone who knows better.


  • I agree with most of what you said, except for the Windows examples. The pages that you linked begin with three-line TL;DRs that are enough for any barely-competent user to find and modify the necessary settings. While the full instructions may be tortuously detailed, are they actually hard to understand?

    And sure, those Windows pages don't advance the user's knowledge in any meaningful way, but neither does blindly copying and pasting a line of shell commands.

    By the way, while I appreciate that we're talking about if and how CLI is superior to GUI, and not Linux versus Windows...

    Where-as Linux users can easily share commands and fixes or tests over a simple irc chat, because the command line reaches the whole system.

    ... both of those tasks can be done via CLI in Windows, too. I am very happy that I switched to Linux, but there's no reason to misrepresent the other guys.


  • One thing that wasn't mentioned in the article is default settings. In so many CLI programs (and FOSS in general), there seems to be some kind of allergy to default settings. I don't know whether it's a fear of doing the wrong thing, or a failure to sympathize with users who haven't spent the last three months up to their elbows in whatever the program does. But so often I've come to a new program and, if I've managed to stick with it, realized later that at least half of the settings I had to research for hours and enter manually every time could have been static defaults, or assumptions easily derived from other settings, in 99% of cases. Absolutely let your users override any and all defaults, but please use defaults.

    I'd also be interested in the overlap between people saying, "LOL just get gud" about using the command line, and people who are terrified of using C++.


  • And don't forget that git isn't GitHub! You can use git locally and make your own offsite backups, and there are other git-based online services besides GitHub as well. As a solo dev working on personal projects, I found it much easier simply to ignore the online side completely, but git on its own is still super useful.