Simple filters don't produce a fixed change in amplitude (dB) for a fixed change in linear frequency (Hz). Making the amplitude fall by exactly 0.1 dB/Hz would be very difficult; it could perhaps be done with a very carefully designed digital filter.
However, there is a loophole. Since only the *average* change must be -0.1 dB/Hz over a specified range, this spec is really only telling you what the relative amplitudes at the ends of that range need to be.
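To see what the average-slope requirement actually constrains, here is a minimal sketch (the specific numbers are taken from the problem as quoted below, -3 dB to -10 dB at -0.1 dB/Hz):

```python
# An average slope over a range only pins down the endpoints:
# (g_end - g_start) / (f_end - f_start) = slope
g_start = -3.0   # dB at the start of the range
g_end = -10.0    # dB at the end of the range
slope = -0.1     # required average slope, dB/Hz

span_hz = (g_end - g_start) / slope
print(span_hz)   # 70.0 -> the -10 dB point must sit 70 Hz past the -3 dB point
```

Any filter whose response passes through those two endpoint gains satisfies the average-slope requirement, regardless of its shape in between.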
Let's look more carefully at the problem statement:
> after the cutoff frequency from -3dB to -10dB the average decrease in db/Hz = -0.1 db/Hz.
Note that -3 dB is what you get at the rolloff frequency (I don't like to call it "cutoff", it's gradual) of a single pole low pass filter. The spec starting from -3 dB therefore means starting at the rolloff frequency in the frequency dimension, which is 100 Hz in your case. At 0.1 dB/Hz, it takes another 100 Hz to get another 10 dB.
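For reference, the single-pole magnitude response mentioned above can be checked numerically. This is a generic sketch, assuming the 100 Hz rolloff frequency from your problem:

```python
import math

def single_pole_gain_db(f, fc):
    """Gain in dB of a single-pole low pass at frequency f (Hz),
    with rolloff (corner) frequency fc: |H| = 1 / sqrt(1 + (f/fc)^2)."""
    return 20 * math.log10(1 / math.sqrt(1 + (f / fc) ** 2))

fc = 100.0  # rolloff frequency assumed from the problem
print(single_pole_gain_db(100, fc))  # about -3.01 dB at the rolloff frequency
print(single_pole_gain_db(170, fc))  # about -5.9 dB, so one pole alone rolls off too slowly
```

This also shows that a single pole by itself won't meet the spec past the corner, which is part of why the filter design is left to you.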
So, the spec you have really only says two things. The filter must have a gain of -3 dB at 100 Hz, and -10 dB at 170 Hz (7 dB at 0.1 dB/Hz takes 70 Hz). Alternatively, you could interpret it to mean -3 dB at 100 Hz and -13 dB at 200 Hz. Your wording is a bit vague, and you seem to be paraphrasing the actual problem statement. It would have been better if you had shown us the problem exactly as written. You'll have to resolve the ambiguity on your own.
Since this is a homework problem, I'm not going to go further and design the filter for you.