2025-06-09

1. Computer Optimization Techniques Applied to You! Yes, You!

1.1. Immediate Disclaimer

I am a programmer. I do not know much about, and do not really care for, psychology or the brain's physiology. These are just some fun observations I've made about myself. I assume they also apply to others, but they may not. Now, if anything is wrong about computers, then that's on me. Sorry.

I also do not want to imply that a human brain and a computer are the same or even similar (because they probably aren't, I don't know), but I do believe that a lot of these software optimization techniques are just more specialized versions of universal methods. For example, O(1) or O(log n) in Big O notation just suggests that we will need to do less work. Caching is just making the things you need faster to get.

1.2. Caching

My coffee cup is heavy, so what I do is simply go to the kitchen for every sip. Well, in computer science, we usually refer to this as "[JavaScript](https://www.kaspersky.com/blog/apple-cpu-encryption-vulnerability/50869/)". Anyways, caching can occur in two separate ways. One is, as I just alluded to, simply bringing information closer to the thing that uses it. These are the processor's and graphics card's caches. The second method is precomputing something and storing it in memory. For example, text is often cached in graphical applications: because drawing each character of a paragraph takes a little while, we can just draw the characters once to an image and then draw that image however many times a second we want. Websites are also often cached (downloaded locally to files). This is why I can go to https://monkeytype.com without internet. The first type of caching is much more of a concern for the programmer. For example, if you are drawing a filled rectangle with a CPU, you do not write: for(x in 0..width) for(y in 0..height) {...}; instead, you do: for(y in 0..height) for(x in 0..width) {...}. Or, like, just do not use linked lists... (exaggeration, kind of)

Now, the obvious analogy here is short-term memory. After submitting a geography test, it is common courtesy to completely clear out the system cache and finally forget which way the North Sea is. However, I would argue that muscle memory is caching too. And, while short-term memory is cool (I couldn't make anything complex without it), muscle memory is an absolutely game-changing, overpowered life hack that we are all just used to. Now, while computer cache is mainly "spatial" and "temporal", i.e., based on space and time, we seem to still like familiar things but introduce our own rating for what should stay in the cache. This is mainly based on repetition count, but also, seemingly, on random chance, on how important we hold something to be, and on our general state at each repetition. So, our muscle memory works not only like memory, but also like muscle...

1.3. Branching

Branches occur whenever the computer has to make a decision. For example, an if statement creates 1 extra branch, an if-else creates 2 extra branches, and so on. Now, this is almost unnoticeable when you have, for example, a for loop: for(int i = 0; i < 1000000; i++);. Like, if compiler optimization is off, there are a million branches there! And yet everything is perfectly fine, because the branch predictor (most likely) takes into account how many times branches were taken in the past and tries to predict whether a branch will be taken this time based on these statistics. So, I imagine, the processor will correctly predict the future 999,998 to 999,999 times and fail once or twice.

Let's say you are like me and want to replace a pair of parentheses with commas, so "(a mistake)" with ", a mistake,". Well, okay, you first need to get to the opening parenthesis. In Microsoft Word you have 2 choices: mouse or 🠲 🠲 🠲 🠲 🠲 🠲 🠲 🠲 🠲 🠲... Maybe other people less so, but I personally find the mouse tiresome. Because you don't just flick to the parenthesis like the skull-poppin' gun-spinnin' knife-pullin' juan-deagin' speed demon Counter-Strike legend you will never be; instead, you quickly move your cursor close to the target and then make 3 to 5 microadjustments. Or, alternatively, there are the arrow keys, where you either need to hit (or hold) the key (possibly with Ctrl) until you reach the objective. With Vim, there are a couple of solutions to this case. There are the usual: t( and f(, which jump to 1 character before ( and right at ( respectively. And there is also the specialized: %, which, when pressed in normal mode, cycles between matching pairs of parentheses OR jumps to the first of the upcoming: )/]/}. There is no decision here. I want to jump to a parenthesis, so I press f(. Then to replace it, I just s, ,, <Esc>, which in Word would be Delete, ,, Space (or, God forbid: mouse select, Backspace, ...). And for the closing parenthesis it's the same story: in Vim, f) and then . (dot repeats the last non-movement command), and yada yada yada. Now, I also want to mention why "🠲 🠲 🠲 🠲 🠲 🠲 🠲 🠲 🠲 🠲" is problematic. If you remember, earlier I said that the for loop with a million branches is perfectly fine because they are very predictable. Arrows are the same... Except, no one presses 🠲 for 10,000 characters at a time. It is more like 10, and so you fail the prediction 10 to 20 percent of the time. What I wanted to highlight with this comparison is that micro decisions can also be slow and tiresome. Instead, you can try to leave as much as possible to your muscle memory.

1.4. Multithreading

Multithreading allows the computer to calculate multiple things at once. For some problems, like playing 16 instances of old Minecraft at the same time, a CPU with 16 hardware threads is 16 times faster than the same CPU running one thread with the other 15 idle. For other problems, mainly those with a lot of data dependency (where the first result is used to calculate the second) and those with imposed limitations (loading an SDF font in Raylib ಥ_ಥ ), multithreading can quickly become slower (synchronizing just makes the computer do more work, and context switching murders my precious cache, swaps stacks, and uses more memory), and it is also much harder to implement. Okay, one last thing on the computer side. I mentioned hardware threads. Why? Well, there are also software threads. Hardware threads are the real "for each thread you get an extra CPU second per second," while software threads are made up and imagined by the operating system. Software threads can still make code execution faster, but only when there is idling, for example, waiting for an HTTP response. And so, yes, you can make your 2001-thread thread pool with a thread for each client of your server, but don't.

Now then... People... Umm... We are terrible at doing multiple things at once, right? Well, kind of. I mean, I am currently: breathing, maintaining my shrimp posture, enjoying The Strokes, typing, thinking of this sentence, and casting curses upon a mosquito. That's like 6 tasks that I am doing concurrently. But okay, I cannot spin my arm and foot in opposite directions. And, you know..? The concept of "distraction"... So, what's different here? Actually, for the spinning, I could probably learn to do it relatively quickly. I can spin my hands in opposite directions right now, and I can rush B in Counter-Strike, where your left and right hands are not even in the same dimension... But actions like typing, sitting, and articulating are muscle memory for me; music is sometimes distracting, but mostly it just gives me good feels and grooves. And so, I propose that we can multitask things controlled by our intuition well enough (except we cannot look at 2 things at once, or listen to many unrelated, unexpected sources), but, honestly, we cannot apply our logical thinking to more than one action at a time. This, I guess, may also be where distraction comes from, like driving while talking (not chatting) over the telephone; you're using your logical thinking for talking and do not even get the chance to catch yourself making a mistake while driving (although I don't know how to drive, but still).

1.5. Batching

Batching is combining multiple items into a single unit and then doing something with them all at once. And it's a little bit important. I'm sure you can imagine why sending an Internet packet to a client, waiting on the response, and only then sending another packet to another client would be slower than sending all packets to all clients straight away. Apart from that, batching is also used in game programming via draw call batching and data-oriented design, which, unlike plain object-oriented programming, uses batching to transform large blocks of data in one go. This then also allows us to use the micro version of data batching: SIMD (Single Instruction, Multiple Data), which I won't get into; it feels like literally everyone has explained it in a YouTube video already. But one of the reasons why it works is just that the computer takes time to discern that you are asking it to "add" numbers, for example.

For humans, well... Oh yeah, I should try batch cooking at some point! Huh... I mean... We batch our trips out to town, like shopping straight after work. I can also do a bit of reflection here. The way I write blogs is: 1. I think of a sentence to write, 2. I write half of the sentence, 3. I realize that I am not sure of something, 4. I start searching the web, 5. I find the thing, but by then I have already forgotten what I wanted to write, 6. I am not sure about a comma or the spelling of "discern" or something and I again get distracted... It would make so much more sense for me to write in stages: 1. First draft: I just write what I know, 2. I look over the facts and rewrite poorly phrased parts, 3. I check my grammar and spelling, 4. I recheck everything to regain some of the fine-grained improvements my first method gives. But I can safely assume everyone else already does this, and I am the only one who is able to find and read the story of King Solomon in between writing this sentence and the last.

1.6. Compression

Data compression is changing information in such a way that it is possible to recover what was originally there, but the result uses less space. So, for example, we often say, "There are 15 red balls, 7 blue rectangles and 3 green triangles." (I mean, I literally wrote it like 8 words ago) instead of saying:

"There are: red ball, red ball, red ball, red ball, red ball, red ball, red ball, red ball, red ball, red ball, red ball, red ball, red ball, red ball, red ball, blue rectangle, blue rectangle, blue rectangle, blue rectangle, blue rectangle, blue rectangle, blue rectangle, green triangle, green triangle and a green triangle"

And even then there is compression in that, because what is red? And what is a ball? Or a rectangle? And where are they?

In computers, compression is not something that we usually do at runtime; it is much more of a file storage and transfer technique. For example, my favorite file format, JPEG XL (a better, newer JPEG), makes one of my wallpapers 40 times smaller (versus uncompressed) at 70% quality, which is just above the officially recommended range; personally, I don't mind the artifacting even at 30% quality. Videos are even more fun in this regard. The 7-second FR E SH A VOCA DO video can go from half a megabyte to half a gigabyte without compression...

Meanwhile, as I previously pointed out, we use a lot of compression to express what we think, see and know to other people. We also use compression to increase our memory. One memory technique that, I believe, uses data compression is called chunking. From my understanding, it works kind of like LZ (dictionary-based compression): you replace/associate small chunks of data with something you already know. For example, you think of a number like 19911102 as 1991 November 2nd or, better yet, the day Vim was released. At first, you need to remember 8 digits and their order. With some chunking, you only need to remember 5 to 6 things ("19" is VERY common in dates). And, well, you should already know when Vim was birthed, so you only need to remember 1 thing in that case.

1.7. Fun Extras

  1. Raymond Chen's article "Does Windows have a limit of 2000 threads per process?"
    https://devblogs.microsoft.com/oldnewthing/20050729-14/?p=34773
  2. I love that the Wikipedia article on Chunking (psychology) is written exactly like a computer science article. Like, between Chunking and Memoization, it's genuinely difficult to pick which is computer science and which is psychology.
    https://en.wikipedia.org/wiki/Chunking_(psychology)
    https://en.wikipedia.org/wiki/Memoization
  3. Wikipedia article on King Solomon: concisely written, but also a cool story and a smart gambit.
    https://en.wikipedia.org/wiki/Judgement_of_Solomon
  4. Very in-depth; I have actually struggled to find much information about branch prediction, and this is a nice explanation.
    https://comparch.net/2013/06/30/why-tage-is-the-best