The DIP switches are historical artifacts. Until the early-to-mid 1990s, most monitors ran at a fixed horizontal and vertical frequency. For all intents and purposes, this meant that a monitor supported one particular resolution and refresh rate. If you wanted a higher resolution, or a higher refresh rate, you needed a different monitor.
Which raises a problem: how does the computer figure out what type of monitor is attached? In the really early days, a video card would simply assume that a compatible monitor was attached. Some vendors let you select the resolution and refresh rate from a control panel. Yet other video cards had firmware that let you select an approximate resolution and refresh rate during boot. But Apple chose a different route: let the monitor tell the video card what it is.
Modern monitors tell the video card about themselves by sending identification data (EDID) down the video cable. But that approach would have presented a couple of problems back then. Old monitors had a physical shutoff switch: turn the monitor off, and none of its circuits have power, so a powered-down or unplugged monitor could not tell the computer what it is. The other issue is that the electronics to generate such a signal would have added extra cost to already pricey equipment. So Apple instead connected particular pins on the monitor's connector together, with each pin combination representing a different monitor type. The computer could probe the pins even when the monitor was turned off. Of course, to properly use a PC monitor, the adapter either had to be hard-coded to a particular monitor type or use DIP switches to let the user select one.
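The pin scheme amounts to a small lookup table: the monitor grounds some combination of the connector's three sense lines, and the video card reads the resulting 3-bit code. Here is a minimal sketch in Python, assuming a handful of commonly cited Apple sense codes; treat the exact code-to-monitor mapping as illustrative rather than a complete or authoritative list.

```python
# Sketch of Apple's "sense pin" scheme on the DB-15 video connector.
# The monitor ties some of the three sense lines to ground; the card
# reads them as a 3-bit code. The codes below are commonly cited
# examples, not an exhaustive table.
SENSE_CODES = {
    0b001: 'Portrait monochrome (640x870)',
    0b010: '12" RGB (512x384)',
    0b110: 'Apple Hi-Res 13"/14" (640x480)',
    0b111: 'no monitor attached',
}

def identify_monitor(sense_bits: int) -> str:
    """Map a 3-bit sense code to a monitor type; unknown codes fall through."""
    return SENSE_CODES.get(sense_bits & 0b111, 'unknown monitor type')

print(identify_monitor(0b110))  # Apple Hi-Res 13"/14" (640x480)
print(identify_monitor(0b111))  # no monitor attached
```

Note that "all pins open" reads as `0b111`, which is exactly why an unplugged or powered-off monitor is still detectable: the absence of any grounded pin is itself a valid code meaning "nothing attached." A hard-coded PC adapter simply wires one fixed combination; a DIP-switch adapter lets the user choose which combination to present.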