Isn't the tip(per) size (tipper quantity) irrelevant?

If I double the surface area on the rain gauge then sure I get double the water pouring through the funnel.

In other words it'll tip twice instead of once for the same rain rate.

If 1mm of rain falls over a surface area of 100mm^{2}, that would be: 100*1 = 100mm^{3} (0.1ml).

If 1mm of rain falls over a surface area of 200mm^{2}, that would be: 200*1 = 200mm^{3} (0.2ml).
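A quick sketch of that depth-times-area arithmetic (the function name is mine; note that 1ml = 1000mm³):

```python
def collected_volume_ml(rain_mm: float, area_mm2: float) -> float:
    """Volume of water (ml) caught for a given rainfall depth and funnel area.

    depth (mm) * area (mm^2) gives mm^3; divide by 1000 to get ml.
    """
    return rain_mm * area_mm2 / 1000.0

# 1mm of rain on a 100mm^2 funnel:
print(collected_volume_ml(1.0, 100.0))  # 0.1 (ml)
# Doubling the area doubles the volume for the same rainfall:
print(collected_volume_ml(1.0, 200.0))  # 0.2 (ml)
```

Whatever the absolute numbers, the point is the proportionality: twice the funnel area means twice the water for the same rain rate.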

So I would have to adjust the WD resolution to 0.5, i.e. count only half the quantity for every tip.

That's how I understand it from the WD setup below:

So if my tipper catches 2ml per tip and I double the surface area on the rain gauge, the tipper catches double the amount of water, i.e. it will tip twice: 2*2ml = 4ml. Therefore WD **must** only record half of it (4ml*0.5 = 2ml).
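A minimal sketch of that correction (function and parameter names are my own, not WD's actual settings):

```python
def recorded_rain(tips: int, mm_per_tip: float, resolution: float = 1.0) -> float:
    """Rain the software would log: tips * tip size, scaled by a resolution factor."""
    return tips * mm_per_tip * resolution

# Suppose one tip normally equals 1.0mm of rain on the stock funnel.
stock = recorded_rain(tips=1, mm_per_tip=1.0)

# With the funnel area doubled, the same rain produces twice the tips,
# so a resolution of 0.5 keeps the logged total unchanged:
doubled = recorded_rain(tips=2, mm_per_tip=1.0, resolution=0.5)

assert stock == doubled  # both 1.0mm
```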

Am I missing something here?

By the way, I did some tipping on my rain gauge (WMR200) and I'm getting conflicting reports from the console, the WMR gatherer and the WD logs.

```
Time of tip    14h30   15h16   17h30   TOTAL
------------   -----   -----   -----   -----
WMR Console    1.1mm   1.0mm   1.1mm   3.2mm
WMR Gatherer   9.7     10.8    11.8      ?
WD Log         1.3mm   1.1mm   1.0mm   3.4mm
```
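One thing I notice (just a guess on my part): the WMR Gatherer numbers look like a running total rather than per-tip amounts, since their successive differences land in the same ~1mm range as the individual tips:

```python
# WMR Gatherer readings at 14h30, 15h16 and 17h30
gatherer = [9.7, 10.8, 11.8]

# Differences between successive readings
diffs = [round(b - a, 1) for a, b in zip(gatherer, gatherer[1:])]
print(diffs)  # [1.1, 1.0] -- same order of magnitude as the per-tip values
```

If that's right, the Gatherer and the console aren't really contradicting each other; they're just reporting cumulative vs. per-tip rain.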

The previous record in the log, on 4 Feb 2009 @ 16h55, recorded 0.7mm.

No other rain for the day. I also see some tips of 0.8mm in the logs.

Now I wonder what my tip size is.

Is WD making its own rain, or did I mess up a configuration somewhere?