"As a consequence, recommended swap space is considered a function of system memory workload,"
That sounds good, but then we want a rule linking swap space to workload, or better, linking RAM size to workload.
The reasoning would be that your swap space should be big enough to handle the maximum possible demand, whereas the amount of RAM is linked to your 'working set size'. A reasonable measure is the load average reported by uptime, i.e. the run queue length. If that is 20 (as a long-term average for a system with unlimited RAM), your working set is 20 threads and you would also want that many processor cores. A first approximation is to figure out which programs those threads belong to, add up the memory they need, and install roughly that amount of RAM.
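If you want to put rough numbers on that, something like the sketch below will do (Linux-only; the use of the 15-minute load average and the "top N resident sets" heuristic for picking the programs are my assumptions, not gospel):

```python
#!/usr/bin/env python3
"""Rough working-set estimate: read the long-term load average, then
sum the resident set sizes of the N largest processes."""
import os

def load_average_15min():
    # /proc/loadavg holds the 1-, 5- and 15-minute run-queue averages.
    with open("/proc/loadavg") as f:
        return float(f.read().split()[2])

def rss_kib(pid):
    # VmRSS line in /proc/<pid>/status, value in KiB.
    try:
        with open(f"/proc/{pid}/status") as f:
            for line in f:
                if line.startswith("VmRSS:"):
                    return int(line.split()[1])
    except (FileNotFoundError, PermissionError, ProcessLookupError):
        pass
    return 0

def estimate_working_set():
    n = round(load_average_15min())  # ~ number of runnable threads
    # Crude stand-in for "the programs those threads belong to":
    # take the n largest resident sets on the system.
    sizes = sorted((rss_kib(p) for p in os.listdir("/proc") if p.isdigit()),
                   reverse=True)
    return n, sum(sizes[:n])

if __name__ == "__main__":
    n, kib = estimate_working_set()
    print(f"run queue ~{n}; top-{n} RSS total ~{kib / 1048576:.1f} GiB")
```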
A better approach takes the size of those programs and the task-switching frequency into account. Or simpler: just shrink the RAM to the point where effective throughput and response time drop to the required level, for instance half the performance of a machine with unlimited RAM. Or look for the point where the swap device is busy about 15% of the time on average. With SSDs you would install rather less RAM than with magnetic disks, because the latency of your swap device is so much lower.
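Measuring that 15% figure is easy enough with iostat, or with a few lines of Python against /proc/diskstats. A minimal sketch; the device name and sampling window are placeholders, check /proc/swaps for your actual swap partition:

```python
#!/usr/bin/env python3
"""Sample how busy a (swap) device is over a short interval."""
import time

DEVICE = "sda2"    # hypothetical swap partition; see /proc/swaps
INTERVAL_S = 10

def io_ticks_ms(device):
    # The 10th stat field after the device name in /proc/diskstats
    # is the cumulative milliseconds the device spent doing I/O.
    with open("/proc/diskstats") as f:
        for line in f:
            parts = line.split()
            if parts[2] == device:
                return int(parts[12])
    raise ValueError(f"device {device!r} not found")

before = io_ticks_ms(DEVICE)
time.sleep(INTERVAL_S)
after = io_ticks_ms(DEVICE)
busy = (after - before) / (INTERVAL_S * 1000)
print(f"{DEVICE} was busy {busy:.0%} of the last {INTERVAL_S}s")
```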
A simple rule of thumb: you need at least as much swap space as RAM in order to store a crash dump.
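Checking that rule is a one-liner against /proc/meminfo; a minimal Linux-only sketch:

```python
#!/usr/bin/env python3
"""Crash-dump rule of thumb: swap should be at least as large as RAM."""

def meminfo_kib(key):
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith(key + ":"):
                return int(line.split()[1])  # values are in KiB
    raise KeyError(key)

ram = meminfo_kib("MemTotal")
swap = meminfo_kib("SwapTotal")
verdict = "OK" if swap >= ram else "too small for a full crash dump"
print(f"RAM {ram} KiB, swap {swap} KiB: {verdict}")
```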
What is computer literacy anyway? It sounds like having to learn several programming languages and then writing your Ph.D. thesis on the life of Bill Gates after reading all the programs he wrote. Less useful than Shakespeare.
Informatics (computer science) is a branch of applied mathematics, and programming is a large part of it. I am absolutely for mathematics in secondary education, and maths might be more fun if the teacher taught a little programming on the side.
"And the world frankly needs more programmers."
What the world needs is fewer programmers; what it does need is computer science graduates who can program. The problem is that the current crop of so-called programmers cannot produce bug-free code, therefore their code should be illegal. And while CS professors claim to have a method of programming, they have so far been unable to teach it to the IT industry.
"As a consequence, recommended swap space is considered a function of system memory workload,"
That sounds good, but then we want a rule linking swap space to workload, or better, linking RAM size to workload.
The reasoning would be that your swap space should be big enough to handle the maximum possible demand, whereas the amount of memory is linked to your 'working set size'. A reasonable measure is to look at your uptime (run queue length). If that is 20 (long-term average fior a system with infinite RAM) that means your working set is 20 threads and you would also need that number of processor cores. A first approximation would be to figure out what programs those threads might belong to, and the amount of memory they need and install roughly that amount of RAM.
A better approach takes the size of those programs and the task switching frequency into account. Or simper:: just shrink the RAM to the point where the effective throughput and response-time is the required level, for instance, half the performance of a machine with unlimited RAM. Or you just llook for the point where the swap device is busy waiting 15 % of the time on average. With SSDs, you would install rather less RAM than with magnetic discs, because the latency of your swap device is so much lower.
A simple rule of thumb is that you need at least the amount of RAM for swap space in order to store a crash dump.