$$y = v_0t - \frac{1}{2}gt^2$$
where:
$y$ is the height of the object at time $t$,
$v_0$ is the initial velocity of the object,
$g$ is the acceleration due to gravity ($9.8\ \mathrm{m/s^2}$), and
$t$ is the time at which the object is at height $y$.
When the object hits the ground, y = 0. Substituting this into the equation, we get:
$$0 = v_0t - \frac{1}{2}gt^2$$
Factoring out $t$ gives $t\left(v_0 - \frac{1}{2}gt\right) = 0$, so either $t = 0$ (the moment of launch) or
$$t = \frac{2v_0}{g}$$
Substituting the given values, we get:
$$t = \frac{2(250\ \mathrm{m/s})}{9.8\ \mathrm{m/s^2}}$$
$$t \approx 51.02\ \mathrm{s}$$
Therefore, it will take the arrow approximately 51.02 seconds to hit the ground (assuming it is launched straight up and neglecting air resistance).
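As a quick sanity check, the flight-time formula can be evaluated numerically; this is a minimal sketch, and the variable names are my own:

```python
# Numeric check of t = 2*v0/g for a projectile launched straight up
# and returning to its launch height (air resistance neglected).
v0 = 250.0   # initial velocity, m/s
g = 9.8      # acceleration due to gravity, m/s^2

t = 2 * v0 / g  # total time of flight
print(f"t = {t:.2f} s")  # → t = 51.02 s
```

The printed value matches the hand calculation above.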