Let `A` and `B` be two independent events from a sample space.
If `\text{Pr}(A) = p`, `\text{Pr}(B) = p^2`, and `\text{Pr}(A) + \text{Pr}(B) = 1`, then `\text{Pr}(A′ ∪ B)` is equal to
- `1 - p - p^2`
- `p^2 - p^3`
- `p - p^3`
- `1 - p + p^3`
- `1 - p - p^2 + p^3`
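A worked derivation (a sketch): since `A` and `B` are independent, so are `A′` and `B`, which lets the inclusion–exclusion term factor:

```latex
\begin{align*}
\Pr(A' \cup B) &= \Pr(A') + \Pr(B) - \Pr(A' \cap B) \\
               &= \Pr(A') + \Pr(B) - \Pr(A')\Pr(B) && \text{(independence of } A' \text{ and } B\text{)} \\
               &= (1 - p) + p^2 - (1 - p)\,p^2 \\
               &= 1 - p + p^2 - p^2 + p^3 \\
               &= 1 - p + p^3.
\end{align*}
```

Note that the constraint `\text{Pr}(A) + \text{Pr}(B) = 1` (i.e. `p + p^2 = 1`) is not needed to reach this form, which matches the fourth option.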