asked · 42.8k views
2 votes
A computer takes 3x^2 + 2 milliseconds to process a certain program. If the program has 4 lines of static code (these lines are always required for the code to run) and x variable lines, what is the average amount of time it takes to process each line?

2 Answers

2 votes

Solution:

Average time per line = total time / total lines

Average time per line = (3x^2 + 2)/(x + 4)

Dividing 3x^2 + 2 by x + 4 gives quotient 3x - 12 with remainder 50, so:

Average time per line = 3x - 12 + 50/(x + 4) milliseconds
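As a quick sanity check (a sketch, not part of the original answer), the divided form can be compared numerically against the direct quotient for a few sample values of x:

```python
# Spot-check that (3x^2 + 2)/(x + 4) equals 3x - 12 + 50/(x + 4)
# for several sample variable-line counts x (assumed positive integers).
for x in [1, 2, 5, 10, 100]:
    total_time = 3 * x**2 + 2            # milliseconds for the whole program
    total_lines = x + 4                  # 4 static lines + x variable lines
    direct = total_time / total_lines    # average straight from the quotient
    divided = 3 * x - 12 + 50 / (x + 4)  # result of the polynomial division
    assert abs(direct - divided) < 1e-9
```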

answered by User Edinson (7.9k points)
4 votes

Answer:

Average time taken to process each line is 3x - 12 + 50/(x + 4) milliseconds.

Explanation:

A computer takes 3x^2 + 2 milliseconds to process a certain program.

If the program has 4 lines of static code and x variable lines, then total lines to process will be

⇒ Total lines = (x + 4)

Now average amount of time to process each line = (Total time to process a program) ÷ (Total lines to process)

Average time = (3x^2 + 2)/(x + 4)

Dividing 3x^2 + 2 by x + 4 gives quotient 3x - 12 with remainder 50, so:

Average time = 3x - 12 + 50/(x + 4) milliseconds

So the average time taken to process each line is 3x - 12 + 50/(x + 4) milliseconds.
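The division step above can be sketched with synthetic division in plain Python (a hypothetical helper, not from the original answer; it assumes a divisor of the form x - r, with r = -4 here since x + 4 = x - (-4)):

```python
def synthetic_division(coeffs, r):
    """Divide a polynomial (coefficients from highest to lowest degree)
    by (x - r). Returns (quotient_coeffs, remainder)."""
    row = [coeffs[0]]
    for c in coeffs[1:]:
        row.append(c + row[-1] * r)  # bring down, multiply by r, add
    return row[:-1], row[-1]

# 3x^2 + 0x + 2 divided by (x + 4), i.e. r = -4
quotient, remainder = synthetic_division([3, 0, 2], -4)
print(quotient, remainder)  # [3, -12] 50 -> quotient 3x - 12, remainder 50
```

The remainder 50 over the divisor x + 4 is exactly the 50/(x + 4) term in the answer.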

answered by User Otra (7.9k points)