## anonymous 4 years ago Multiple Choice: A high-tech company purchases a new computing system whose initial value is V. The system will depreciate at the rate f = f(t) and will incur maintenance costs at the rate g = g(t), where t is the time measured in months. The company wants to determine the optimal time to replace the system.

2. anonymous

Anyone? I really need help with this one.

3. campbell_st

Well, the depreciated value will be something like $D = V(1 - f)^t$, where $V$ is the initial value, $f$ is the monthly rate of depreciation, and $t$ is the number of months. So if $V = \$10{,}000$ and $f = 5\%$ per month, then after 1 month the value is $D = 10000 \times (1 - 0.05)$, or $10000 \times 0.95$; after 10 months, $D = 10000 \times (1 - 0.05)^{10}$. The optimum will be at the point(s) of intersection of the maintenance-cost and depreciation curves. In the graph below I've assumed maintenance is linear. [drawing: depreciation curve falling toward a rising linear maintenance-cost line, crossing at the optimal replacement time]
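The graphical idea above can be sketched numerically. This is a minimal sketch, not the problem's actual data: it assumes $V = \$10{,}000$, 5% depreciation per month, and a hypothetical flat maintenance cost of \$300 per month, then finds the first month at which cumulative maintenance catches up with the depreciated value.

```python
# Sketch of the graphical approach: find where cumulative maintenance
# cost overtakes the depreciated value D = V(1 - f)^t.
# Assumed (hypothetical) numbers: V = $10,000, f = 5% per month,
# maintenance accruing at a flat $300 per month.
V = 10_000
f = 0.05   # monthly depreciation rate (assumed)
m = 300    # monthly maintenance cost, linear (assumed)

def value(t):
    """Depreciated value after t months: V(1 - f)^t."""
    return V * (1 - f) ** t

def maintenance(t):
    """Cumulative maintenance after t months (assumed linear)."""
    return m * t

# First whole month at which maintenance has caught up with value.
t_cross = next(t for t in range(1, 200) if maintenance(t) >= value(t))
print(t_cross)
```

With these made-up numbers the curves cross at month 16; the real answer, of course, depends on the actual $f(t)$ and $g(t)$ given in the problem.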

4. anonymous

So how do I find T?

5. anonymous

I have to submit this in 15 min, any further help would be greatly appreciated

6. campbell_st

If the depreciation rate is $f(t) = \dfrac{V}{17} - \dfrac{Vt}{578}$, the system is fully depreciated when $f(t) = 0$, i.e. when $\dfrac{V}{17} = \dfrac{Vt}{578}$. The $V$ cancels, so $t = \dfrac{578}{17} = 34$ months.