We live in a world of color and technology. Because technology handles color differently from the way humans perceive it, we have to account for that when we work with color. Below is a brief summary of the how and why of color spaces.
sRGB, or Perceptual Color Space
The standard for viewing most content is called sRGB. This color space is defined by a specific set of chromaticities and a gamma curve (see Wikipedia: sRGB). Most viewing devices are designed with this standard in mind, and the web has adopted it as the default color space for viewing content.
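To make "a specific chromaticity" concrete, the defining coordinates of sRGB can be written out directly. This is a minimal sketch in Python; the values are the CIE xy chromaticity coordinates from the IEC 61966-2-1 specification, which shares its primaries and D65 white point with Rec. 709:

```python
# CIE xy chromaticity coordinates that define the sRGB color space
# (IEC 61966-2-1; same primaries and white point as Rec. 709).
SRGB_CHROMATICITIES = {
    "red":   (0.64, 0.33),
    "green": (0.30, 0.60),
    "blue":  (0.15, 0.06),
    "white": (0.3127, 0.3290),  # the D65 white point
}
```

Together with the transfer (gamma) curve, these four points are what pin down exactly which colors an sRGB value refers to.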
Why this one and not another one?
The original 1953 NTSC standard selected a camera gamma of 1/2.2. This was chosen based on assumptions about the viewing conditions of the image as well as the non-linear light response of the human eye, so considerable thought has gone into the best way to reproduce an image. A camera also captures information that cannot possibly be displayed on a screen (the actual luminance of the Sun, for example), and this must still be represented convincingly to the human eye. An 8-bit image has only a limited number of code values (bit depth, or reproducible colors) with which to represent a realistic image, so a perceptual adjustment based on human vision is made to make this possible. For a more detailed breakdown of available color depth, see Wikipedia: Color Depth.
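To see why that perceptual adjustment matters for 8-bit storage, here is a rough Python sketch. The 1/2.2 power is the camera-side encode mentioned above; the "darkest 1% of linear light" threshold is an arbitrary illustrative choice, not part of any standard:

```python
def encode_gamma(linear, gamma=2.2):
    """Apply a simple power-law encode (the camera-side 1/gamma)."""
    return linear ** (1.0 / gamma)

# How many of the 256 8-bit code values land in the darkest 1% of
# linear light, where the eye is most sensitive to steps?
codes_linear = round(255 * 0.01)                # linear encoding: only a few codes
codes_gamma = round(255 * encode_gamma(0.01))   # gamma encoding: far more codes
```

With a straight linear encoding, the shadows get almost no code values and would band visibly; the gamma encode redistributes the available codes to roughly match how we actually see.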
CRTs also have a non-linear power transfer characteristic called gamma (a CRT used an electron "gun" to light up your pixels, and this was not a linear transfer of energy; see Wikipedia: Gamma Correction).
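For reference, the sRGB curve is not a pure 2.2 power but a piecewise function with a small linear toe near black, which is roughly equivalent to a plain 2.2 gamma overall. A straightforward Python sketch of the encode/decode pair from IEC 61966-2-1:

```python
def srgb_encode(linear):
    """Linear-light value in [0, 1] -> sRGB-encoded value.

    Piecewise curve from IEC 61966-2-1: a linear segment near black,
    then a 1/2.4 power with offset and scale.
    """
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

def srgb_decode(encoded):
    """Inverse transform: sRGB-encoded value back to linear light."""
    if encoded <= 0.04045:
        return encoded / 12.92
    return ((encoded + 0.055) / 1.055) ** 2.4
```

The linear toe exists because a pure power curve has infinite slope at zero, which is awkward both numerically and for real hardware near black.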
I have an LCD monitor; why do I want to deal with this?
Keeping in mind the history of the 2.2 gamma curve, a myriad of things use the sRGB color space as their default. Everything from analog video to TIFF is based on this color space by default, and changing it to something else would cause a great deal of confusion and loss of compatibility.