You can get much more help if you tell us why you want to call a method every 0.1 seconds. As I said before, there are often other ways of doing things that you have not thought of. Without knowing anything about your application, I can't imagine why you would want to do this.

Hi guys. What I mean by "not accurate enough" is that the method is supposed to be called every 0.1 seconds, but the interval fluctuates quite a lot: for example, it sometimes fires every 0.14 seconds and then every 0.8 seconds. I think it might be because there is quite a lot of code in the timer's selector?
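One way around the fluctuation (a minimal sketch, not the poster's code: the selector name tick: and the NSTimeInterval ivar _lastUpdate are assumed here, while _x, _y, xVelocity and yVelocity come from the thread) is to stop assuming the timer fired exactly 0.1 s after the previous call and instead measure the real elapsed time on each tick. NSTimer makes no accuracy guarantee, so scaling the movement by the measured delta keeps motion smooth even when a callback arrives late:

- (void)tick:(NSTimer *)timer {
    // Measure the wall-clock time that actually elapsed since the last tick.
    NSTimeInterval now = [NSDate timeIntervalSinceReferenceDate];
    NSTimeInterval dt = now - _lastUpdate;
    _lastUpdate = now;

    // With velocities expressed in points per second, a late callback
    // simply moves the object proportionally further in a single step.
    _x = _x + xVelocity * dt;
    _y = _y + yVelocity * dt;
}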
I've tightened up the log statements around the place where the _y value is incremented by yVelocity, like so:
NSLog(@"y before %.20f", _y);
NSLog(@"yvel_____ %.20f", yVelocity);
_x = _x + xVelocity;
_y = _y + yVelocity;
NSLog(@"y after_ %.20f", _y);

and here are the strange results I'm getting (note: both _y and yVelocity are declared as floats globally):
y before 300.00000000000000000000
y after_ 295.00000000000000000000
y before 295.00000000000000000000
y after_ 290.16665649414062500000
y before 290.16665649414062500000
y after_ 285.50000000000000000000
y before 285.50000000000000000000
y after_ 281.00000000000000000000
This is insane, what's going on?

It makes perfect sense. Floats carry about 7 decimal digits of precision, so the extra digits you are printing with %.20f are meaningless. When rounded to 7 significant figures, these logs are quite right. If you do as RickSDK suggests and switch everything to doubles, you will get more precision, about 15 significant digits. But even then things will not add up if you try to look beyond that many digits.
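A minimal standalone check (Objective-C, using a hypothetical velocity value close to the step seen in the logs above; the names yf, vf, yd, vd are not from the thread) makes the point concrete: float arithmetic silently rounds each result to the nearest representable value, and %.20f just pads that value with meaningless digits:

#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        // Hypothetical values close to the ones in the logs above.
        float yf = 295.0f;
        float vf = -4.83333333f;      // a float keeps only ~7 significant digits
        yf = yf + vf;                 // rounds to the nearest representable float
        NSLog(@"float : %.20f", yf);  // logs 290.16665649414062500000

        double yd = 295.0;
        double vd = -4.83333333;      // a double keeps ~15 significant digits
        yd = yd + vd;
        NSLog(@"double: %.20f", yd);  // meaningful to roughly twice as many places
    }
    return 0;
}

The float result is exactly the "strange" 290.16665649414062500000 from the logs: it is simply the nearest float to 295 - 4.8333333, not a bug in the increment code.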