An astronaut drops a rock into a crater on the moon. The distance, d(t), in meters, that the rock travels after t seconds can be modeled by the function d(t)=0.8t^2. What is the average speed, in meters per second, of the rock between 5 and 10 seconds?
Answer:
The average rate of change of an object's position with respect to time is its average speed. To find the average speed of the rock over the interval [5,10], we use the average rate of change formula [tex]V(t)= \frac{d(b)-d(a)}{b-a} [/tex] where
[tex]V(t)[/tex] is the average rate of change (the speed)
[tex]d(a)[/tex] is the position function evaluated at [tex]a[/tex]
[tex]d(b)[/tex] is the position function evaluated at [tex]b[/tex]
[tex]a[/tex] is the first point in the interval
[tex]b[/tex] is the second point in the interval
We know from the problem that the first point in the interval is 5 and the second point is 10, so [tex]a=5[/tex] and [tex]b=10[/tex]. Let's substitute those values into the formula:
[tex]V(t)= \frac{d(b)-d(a)}{b-a} [/tex]
[tex]V(t)= \frac{d(10)-d(5)}{10-5} [/tex]
[tex]V(t)= \frac{0.8(10^2)-0.8(5^2)}{10-5} [/tex]
[tex]V(t)= \frac{80-20}{5} [/tex]
[tex]V(t)= \frac{60}{5} [/tex]
[tex]V(t)=12[/tex]
We can conclude that the average speed of the rock between 5 and 10 seconds is 12 meters per second.
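As a quick numerical check, here is a minimal Python sketch of the same computation; the function names d and average_speed are just illustrative, not part of the problem:

[code]
def d(t):
    """Position of the rock, in meters, after t seconds (from the problem)."""
    return 0.8 * t ** 2

def average_speed(a, b):
    """Average rate of change of d over the interval [a, b]."""
    return (d(b) - d(a)) / (b - a)

print(average_speed(5, 10))  # prints 12.0
[/code]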