Throw the multimeter in the bin - it must be faulty.
Seriously, though, decibels (dB) are always relative to something. Absolute levels are measured in units such as dBu, dBm, or dBV. Your recorded tracks are probably -20 dB to -15 dB relative to full scale on your interface, so they are at a good level to be working at. If full scale is (say) +24 dBu, then when your tracks are output via your interface's D-A converters, the absolute levels will be +4 to +9 dBu. The multimeter will have a different idea of what its 0 dB is, unless it happens to know about dBu or dBm, in which case it would read +4 to +9 dB for the above example.
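If it helps to see the arithmetic, it's just a fixed offset. A minimal sketch in Python, assuming the +24 dBu full-scale figure from the example above (check your interface's spec sheet for the real number, as it varies between models):

```python
def dbfs_to_dbu(level_dbfs: float, full_scale_dbu: float = 24.0) -> float:
    """Absolute analogue output level (dBu) for a DAW level (dBFS),
    assuming 0 dBFS at the D-A converter corresponds to full_scale_dbu."""
    return level_dbfs + full_scale_dbu

# Tracks sitting at -20 to -15 dBFS through the example interface:
print(dbfs_to_dbu(-20.0))  # 4.0 -> +4 dBu at the output
print(dbfs_to_dbu(-15.0))  # 9.0 -> +9 dBu at the output
```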
So to make sense of your readings, you have to know (1) the full-scale level of your interface (0 dBFS) in dBu, and (2) what 0 dB represents on the multimeter in dBu. If the answer to (1) is X dBu and the answer to (2) is Y dBu, then you have to add (Y - X) to your multimeter readings to make them read the same as your DAW.

Note, however, that most multimeters are calibrated for sine waves using a form factor (rms/mean) of 1.11, and will not give the same levels as your DAW (even after the correction) for typical music signals. This is not the case for "true r.m.s." multimeters, whose readings should agree with your DAW after the level correction.
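Here is a sketch of the whole correction, plus a demonstration of why an averaging meter disagrees on non-sinusoidal signals. The numbers are made-up example values (a 0 dBV-referenced meter and a +24 dBu full-scale interface), not anything your particular gear is guaranteed to use:

```python
import numpy as np

def meter_to_dbfs(meter_db: float, meter_ref_dbu: float,
                  full_scale_dbu: float) -> float:
    """Translate a multimeter dB reading onto the DAW's dBFS scale.
    meter_ref_dbu:  what the meter's 0 dB mark represents, in dBu (Y)
    full_scale_dbu: the interface's 0 dBFS level, in dBu (X)
    """
    return meter_db + (meter_ref_dbu - full_scale_dbu)

# A meter referenced to 0 dBV (1 V rms = +2.2 dBu) reading a track that
# plays back at +4 dBu through a +24 dBu full-scale interface:
print(meter_to_dbfs(1.8, meter_ref_dbu=2.2, full_scale_dbu=24.0))  # -20.0 dBFS

# An averaging meter scales the rectified mean by the sine form factor
# (~1.111) instead of computing a true rms, so the two agree on a sine
# wave but not on, say, a square wave:
t = np.linspace(0.0, 1.0, 100_000, endpoint=False)
for name, sig in [("sine", np.sin(2 * np.pi * 10 * t)),
                  ("square", np.sign(np.sin(2 * np.pi * 10 * t)))]:
    averaging = 20 * np.log10(1.111 * np.mean(np.abs(sig)) / 0.775)  # dBu
    true_rms = 20 * np.log10(np.sqrt(np.mean(sig ** 2)) / 0.775)     # dBu
    print(f"{name}: averaging {averaging:+.2f} dBu, true rms {true_rms:+.2f} dBu")
# sine:   both read about -0.79 dBu (they agree)
# square: the averaging meter reads about 0.9 dB higher than true rms
```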