I love the fact it's printed on Sinclair silver thermal paper
Does it depend on the number of users?
It's easy to search the interwebs for this stuff, so here's my interpretation of it, mixed with a bit of practical experience, if anyone can be bothered to read it.
It depends on several things.
Yes, the total bandwidth is shared between however many devices you have, but something streaming HD video will use a lot more of it than your phone on standby, which only checks for email/WhatsApp notifications once every X minutes.
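To make the sharing point concrete, here's a minimal sketch (illustrative numbers, not a real WiFi scheduling model): the link is only really "shared" among devices actively asking for data, and everyone only gets scaled back when total demand exceeds capacity.

```python
def per_device_share(total_mbps, active_demands_mbps):
    """Split the link evenly only when demand exceeds capacity.

    Illustrative only: real WiFi airtime scheduling is far messier.
    """
    total_demand = sum(active_demands_mbps)
    if total_demand <= total_mbps:
        # Everyone gets what they asked for; the idle phone costs almost nothing
        return list(active_demands_mbps)
    # Otherwise scale every device's share down proportionally
    scale = total_mbps / total_demand
    return [d * scale for d in active_demands_mbps]

# One HD stream (~8 Mbps) plus an idle phone polling email (~0.1 Mbps)
print(per_device_share(100, [8, 0.1]))    # both fit easily on a 100 Mbps link
print(per_device_share(10, [8, 8, 0.1]))  # two streams on a poor link get scaled back
```

The takeaway: counting connected devices tells you little; what matters is how many are demanding bandwidth at the same moment.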
The frequency the devices support matters too. On the original 2.4GHz band you would be lucky to get 100Mbps sat next to the router; 5GHz should get you up to 500Mbps.
It's not linear, because the technology uses other clever electronics tricks to make it go faster, like using multiple channels at once.
If you are using 2.4GHz in your garden on a drizzly/foggy day it will be much worse, as the frequency is very close to the one your microwave oven uses to heat (i.e. be absorbed by) water.
It can be a problem if you live in high-density housing, as everybody's WiFi will be fighting everyone else's, like the old days of turning the tuning dial on a MW radio.
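The "fighting each other" bit is worth unpacking. On the 2.4GHz band, channel numbers are spaced 5MHz apart but each WiFi signal is roughly 20MHz wide, so neighbouring channels bleed into each other; that's why only channels 1, 6 and 11 are classically treated as non-overlapping. A quick sketch:

```python
def centre_freq_mhz(channel):
    """2.4 GHz band: channel 1 is centred at 2412 MHz, then 5 MHz steps."""
    return 2412 + 5 * (channel - 1)

def channels_overlap(a, b, width_mhz=20):
    """Two ~20 MHz-wide signals overlap if their centres are closer than that."""
    return abs(centre_freq_mhz(a) - centre_freq_mhz(b)) < width_mhz

print(channels_overlap(1, 6))   # the classic non-overlapping pair
print(channels_overlap(1, 3))   # close neighbours interfere with each other
```

So if your neighbour is on channel 3 and you're on channel 1, you're not just sharing airtime, you're actively corrupting each other's signals.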
Now, more modern WiFi devices claim to support WiFi 6 & 7, which get very fast, but that's not really much good unless you've purchased top-end new devices that support those speeds.
You will find that almost every device you own has a cheap WiFi chip that supports both 2.4 and 5GHz at their standard rates and costs manufacturers 10p.
If you look at the blurb on any WiFi device you buy, it will have a collection of numbers & letters like 802.11acn.
The number refers to the general WiFi standard.
The letters refer to which frequencies & speeds it supports, so they are the important bit.
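As a rough decoder ring, here's a sketch that pulls the amendment letters out of a "802.11acn"-style blurb. The table is a hypothetical lookup of headline theoretical maximums for each 802.11 amendment; real-world speeds are far lower.

```python
# Headline theoretical maximums per 802.11 amendment (illustrative table)
AMENDMENTS = {
    "b":  {"bands_ghz": (2.4,),      "max_mbps": 11},
    "g":  {"bands_ghz": (2.4,),      "max_mbps": 54},
    "a":  {"bands_ghz": (5,),        "max_mbps": 54},
    "n":  {"bands_ghz": (2.4, 5),    "max_mbps": 600},    # "WiFi 4"
    "ac": {"bands_ghz": (5,),        "max_mbps": 6933},   # "WiFi 5"
    "ax": {"bands_ghz": (2.4, 5, 6), "max_mbps": 9608},   # "WiFi 6"
}

def decode(spec):
    """Split '802.11acn' into its amendment letters."""
    letters = spec.replace("802.11", "")
    found, i = [], 0
    while i < len(letters):
        # Match two-letter amendments ("ac", "ax") before single letters
        if letters[i:i+2] in AMENDMENTS:
            found.append(letters[i:i+2]); i += 2
        elif letters[i] in AMENDMENTS:
            found.append(letters[i]); i += 1
        else:
            i += 1
    return found

print(decode("802.11acn"))  # ['ac', 'n']
```

So "802.11acn" tells you the device does 802.11ac (5GHz) plus 802.11n (2.4 & 5GHz), which is exactly the cheap dual-band chip described above.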