0 votes
If x1 < x2 and f is a decreasing function, then f(x1) > f(x2).
a) True
b) False

asked
User Bozzle
by
7.9k points

1 Answer

5 votes

Final answer:

True. If x1 < x2 and f is a decreasing function, then f(x1) > f(x2), because a decreasing function reverses the order of its inputs.

Step-by-step explanation:

True. By definition, f is decreasing when larger inputs produce smaller outputs: whenever x1 < x2, we have f(x1) > f(x2). Here x1 < x2 is given, so the inequality f(x1) > f(x2) follows directly. For example, with f(x) = -x, taking x1 = 1 and x2 = 3 gives f(x1) = -1 and f(x2) = -3, and indeed -1 > -3.
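The order-reversing property can be sketched with a quick numeric check. This is just an illustration using a hypothetical strictly decreasing function f(x) = 5 - 2x (not part of the original question):

```python
# Hypothetical strictly decreasing function: f(x) = 5 - 2x.
# As x grows, f(x) shrinks, so the order of inputs is reversed.
def f(x):
    return 5 - 2 * x

x1, x2 = 1, 3                 # x1 < x2
print(f(x1), f(x2))           # prints: 3 -1
assert x1 < x2 and f(x1) > f(x2)
```

Any pair x1 < x2 would work here, since f is strictly decreasing everywhere.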

answered
User Dmagda
by
8.6k points

