The term "sprite" is vaguely defined. It traditionally refers to the pixelated graphics older computers used to represent objects in things like video games, but it has recently come to encompass any two-dimensional image, most often in a video game context, e.g. "The trees in this 3D open world game are just sprites, what a rip-off!" or "Duke Nukem's arm is just a sprite; he isn't really rendered."

For simplicity's sake, these annotations will use the older, narrower definition of "pixelated stuff used by older machines," and will refer to higher resolution images as "photos" or "artwork" unless otherwise stated.

The most common, dare I say "mainstream," term I've seen associated with sprites is "8-bit," but as anyone with a passing familiarity with the sprite community will tell you, this is a misnomer. (Kind of.) "8-bit" describes a processor's word size, a rough stand-in for processing power, and the only reason it became associated with sprites of a specific resolution and color palette is the "Bit Wars," a marketing phenomenon of the early console era in which manufacturers used processing power as a selling point. Most people immediately associated that processing power with the graphics, hence "8-bit."

Many 8-bit machines had wildly differing graphical resolutions. Some machines even supported differing resolutions within their own software library. Nintendo's systems were famous for using extra chips inside the cartridge to add capability beyond that of the base hardware, or clever layering tricks to display more colors than the normally allowed maximum. The PC market, for quite some time, variously rendered things in CGA or EGA, which provided vastly different experiences on computers of roughly the same processing power. The SNES and the Genesis were both 16-bit, yet each system had its own distinct rendering style.

My general point is that it isn't entirely correct to describe graphics, which the video chip handles, with a measurement of CPU power. But since the jump from 8-bit to 16-bit graphics is usually large enough to notice no matter which machines you compare, here comes my whole point.

You shouldn't mix bit eras in a sprite comic. You can get away with mixing 16-bit and 32-bit sprites, kind of, sort of; it depends on what you're doing. But 8-bit and 16-bit sprites usually don't mix well. This comic makes that mistake, and I'm pointing it out now so I won't have to again every other time it happens.
