I'm not trying to bust your chops man, really just trying to help you see what you're really getting out of BTC mining. To my knowledge, there is no such thing as a Pentium 4 Celeron; it's either a Pentium 4 or a Celeron/Celeron D. I'm guessing it's a P4, because the fastest Celeron ever made was 3.6 GHz. There are a bunch of P4s at 3.8 GHz, including the P4 HT 3.8F, the P4 HT 570J, and the P4 HT 571, all of which draw at least 115 watts, not the 73 watts, which makes things even worse from a cost standpoint. Any of those parts is going to suck up 82.8 kilowatt-hours a month. At 5 cents per kilowatt-hour, you're spending $4.14 on power, and more likely, at 10 cents or more per kilowatt-hour, you're spending $8.28+ on power. All for 13 cents of BTC. Do you know what your power costs are per kilowatt-hour? It's a relatively easy calculation to determine your monthly costs.
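If you want to plug in your own rate, here's the math as a quick sketch (it assumes the CPU runs flat out 24/7 at its rated TDP, which is roughly the worst case):

```python
# Rough monthly power-cost calculator (illustrative; assumes full load 24/7).
def monthly_power_cost(watts, cents_per_kwh, hours_per_day=24, days=30):
    kwh = watts * hours_per_day * days / 1000.0  # watt-hours -> kWh
    return kwh * cents_per_kwh / 100.0           # cents -> dollars

# ~82.8 kWh/month for a 115 W P4
print(monthly_power_cost(115, 5))    # ~$4.14 at 5 cents/kWh
print(monthly_power_cost(115, 10))   # ~$8.28 at 10 cents/kWh
```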
I understand you can't afford a video card good enough for mining, and it's awesome you're interested in BTC/mining, but you're _paying_ at least $4 and likely upwards of $8-$10 a month just to mine, and you're killing your CPU in the process. Stop mining for a few months and stick $10 in a jar each month. You can go on eBay and pick up a 5770 for $40-$50 that will mine 163 times more BTC than you are currently getting with your CPU, at the same or even lower power use. Just about any video card made in the past 3 or 4 years is going to make you more BTC per watt than that CPU. To be honest, even awesome GPUs won't be worth mining with if ASICs hit the scene in the next few months.
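To put that in perspective, here's a back-of-the-envelope comparison using the numbers above (all figures are rough, and I'm assuming the 5770 draws about the same as the CPU):

```python
# CPU vs. 5770, using the figures from this thread:
# 13 cents/month of BTC on the P4, ~163x that on a 5770 at similar wattage.
power_cost = 115 * 24 * 30 / 1000.0 * 0.10  # ~$8.28/month at 10 cents/kWh
cpu_income = 0.13                            # $/month of BTC from the P4
gpu_income = 0.13 * 163                      # ~$21.19/month from a 5770

print(cpu_income - power_cost)  # roughly -$8.15/month (a loss)
print(gpu_income - power_cost)  # roughly +$12.91/month
```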