All electronic cameras perform some kind of processing between the pickup device and the recording of the data. Even old-fashioned analog video applied transfer characteristics known as gamma, the knee curve, and so on. For that matter, celluloid and paper photography is not 100 percent "honest" either, because these artificial systems do not see the world the way the human eye (and brain) does. For example, the technology to reproduce the "latitude" of the real world, meaning the range of energy from deepest black to blinding white, is only now becoming possible with HDRI (high dynamic range imaging). Everything short of that is a compromise of exposure. The same goes for "color temperature," a correction your brain performs automatically. So are you really seeing the world as it is?
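For the curious, here is a rough sketch (plain Python and NumPy, with an assumed gamma of 2.2 and a made-up knee point) of what such a transfer curve does to linear light before it is recorded: shadows are lifted and highlights above the knee are squeezed.

```python
import numpy as np

def apply_gamma(linear, gamma=2.2):
    """Encode linear light with a simple power-law gamma curve."""
    linear = np.clip(linear, 0.0, 1.0)
    return linear ** (1.0 / gamma)

def apply_knee(encoded, knee_point=0.8, slope=0.25):
    """Compress values above the knee point, roughly like a video knee circuit."""
    out = encoded.copy()
    over = encoded > knee_point
    out[over] = knee_point + (encoded[over] - knee_point) * slope
    return out

# Linear sensor values from black (0.0) to clipping white (1.0).
linear_light = np.linspace(0.0, 1.0, 11)
recorded = apply_knee(apply_gamma(linear_light))
print(np.round(recorded, 3))
```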
Most in-camera processing is a formulaic culling of data. In digital cameras, the DSP (digital signal processor) sits between the pickup and the recorder. Many higher-end DSLRs can record in a "RAW" format that bypasses most of the DSP. The result is a much larger file that can be "tweaked" by the photographer at a later time. Instead of having to "bracket" a shot by perhaps a half stop, or even a full stop, the camera stores the full range of what its pickup can see. There are many high-end, professional "digital cinema" cameras that record with essentially no DSP, but these things are incredible data hogs. They are generally used for live-action special effects plates, while the main portion of the movie is captured with more conventional "video" cameras.
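To see why a RAW file can stand in for bracketing, consider this sketch. It compares pulling a bright highlight down by one stop in a simulated 14-bit linear RAW capture versus an 8-bit, gamma-encoded in-camera JPEG; the bit depths and pixel values are illustrative assumptions, not any particular camera's numbers.

```python
import numpy as np

def adjust_exposure_raw(raw14, stops):
    """Scale linear 14-bit sensor data; detail survives as long as it never clipped."""
    pushed = raw14.astype(np.float64) * (2.0 ** stops)
    return np.clip(np.round(pushed), 0, 2**14 - 1).astype(int)

def adjust_exposure_jpeg(jpeg8, stops, gamma=2.2):
    """Decode the baked 8-bit image, scale it, and re-encode; clipped tones stay clipped."""
    linear = (jpeg8.astype(np.float64) / 255.0) ** gamma
    pushed = np.clip(linear * (2.0 ** stops), 0.0, 1.0)
    return np.round((pushed ** (1.0 / gamma)) * 255).astype(np.uint8)

# A bright cloud: still below the sensor ceiling in RAW, but blown to pure white
# by the in-camera DSP's brightening and 8-bit quantization (illustrative values).
raw_highlight = np.array([9000, 11000, 13000])   # 14-bit linear, ceiling is 16383
jpeg_highlight = np.array([255, 255, 255])       # the baked JPEG has already clipped

print(adjust_exposure_raw(raw_highlight, stops=-1.0))   # [4500 5500 6500] -> detail comes back
print(adjust_exposure_jpeg(jpeg_highlight, stops=-1.0)) # [186 186 186]    -> flat gray
```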
Aside from in-camera DSP, there may be a host of reasons to launder the data. Adjustable-plane film cameras were often used to photograph architecture in order to preserve the integrity of all the verticals. Nowadays there are many "distortion" filters to correct such curves introduced by the camera lens. Is that really cheating? (The following example demonstrates this kind of correction, although specialized filters may be much subtler and less "brute force" about it.)
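Here is a bare-bones version of such a filter: a one-term radial remap (plain NumPy and SciPy, with a made-up distortion coefficient) that straightens lines the lens has bowed. The sign and size of the coefficient depend on the actual lens, so treat the values as placeholders.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def correct_radial_distortion(image, k1=0.1):
    """Remap a 2-D grayscale image with a one-term radial lens model.

    k1 is an assumed coefficient; its sign flips between barrel and pincushion.
    """
    h, w = image.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    y, x = np.indices((h, w), dtype=np.float64)
    # Normalized offset of each output pixel from the image center.
    xn, yn = (x - cx) / cx, (y - cy) / cy
    r2 = xn**2 + yn**2
    # Look up where each corrected pixel came from in the distorted source.
    factor = 1.0 + k1 * r2
    src_x = xn * factor * cx + cx
    src_y = yn * factor * cy + cy
    return map_coordinates(image, [src_y, src_x], order=1, mode="nearest")

# A synthetic test card: a grid that a real lens would have bowed.
grid = np.zeros((200, 200))
grid[::20, :] = 1.0
grid[:, ::20] = 1.0
straightened = correct_radial_distortion(grid, k1=0.1)
```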
Panoramic stitching is actually much closer to what space telescopes and space probes require. Space photos may be made up of many exposures (like the panoramic stitching noted above) taken at a variety of wavelengths. The mountain of photos of a moon or planet must be stitched together, then "lens corrected" to account for the movement of the spacecraft, and all of the daylight images compiled into a planet-wide mosaic. The software engineers do their best to make filters that are as "honest" as possible. When colors or elevations are exaggerated, it is usually noted (as on the Astronomy Picture of the Day site).
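To give a flavor of what those pipelines do with multi-wavelength data, the sketch below maps three single-filter exposures onto the red, green, and blue display channels. The channel assignment, the percentile stretch, and the random stand-in data are all assumptions for illustration; this is exactly the kind of exaggerated color that the caption is supposed to disclose.

```python
import numpy as np

def false_color_composite(ir_frame, red_frame, green_frame):
    """Map three single-wavelength exposures onto RGB display channels.

    Each input is a 2-D float array of raw counts from one filter; the
    channel assignment itself is an aesthetic (and openly non-"honest") choice.
    """
    def stretch(frame):
        # Normalize each filter's counts to 0..1 so no single band dominates.
        lo, hi = np.percentile(frame, (1, 99))
        return np.clip((frame - lo) / (hi - lo + 1e-9), 0.0, 1.0)

    return np.dstack([stretch(ir_frame), stretch(red_frame), stretch(green_frame)])

# Illustrative stand-ins for three filter exposures of the same patch of terrain.
rng = np.random.default_rng(0)
ir, red, green = (rng.random((64, 64)) for _ in range(3))
composite = false_color_composite(ir, red, green)   # shape (64, 64, 3), displayable as RGB
```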
But is all of this what you would really see if you were there in person? No, actually it is better.