algorithm - Understanding asymptotic notation when given asymptotic performance, time and number of elements
2nd year computer science student here. In class we're covering asymptotic notation, sorting algorithms, and such. I need help understanding asymptotic notation and how the values relate to one another. I have been at this a little while and can't seem to arrive at a reasonable conclusion. I imagine the answer is simple but it eludes me. My professor provided a hand-out that goes into pretty good detail on asymptotic notation and how it is used.
So, the hand-out asks the following questions:
What is the distinction between big-O and big-Theta asymptotic performance?
My conclusions from the hand-out are:
- Big-O performance means the algorithm can finish, and the asymptotic performance is the worst-case scenario.
- Big-Theta performance means the algorithm cannot finish, and the asymptotic performance is the best-case scenario.
If these assumptions are incorrect, can someone Sesame Street it for me?
-and-
A wood-chuck requires 20 seconds to chuck 60 chucks of wood, and the wood-chuck's wood-chucking algorithm has a temporal asymptotic performance of Theta(n^2). How much wood would the wood-chuck chuck (if the wood-chuck could chuck wood) in 5 minutes?
I understand that since the algorithm is n^2, I can expect the amount of time it takes to chuck more and more wood to grow quadratically. But I don't see how I can find the amount of wood chucked given 5 minutes. I feel like I'm over-thinking this and that the answer is simple. Any help is appreciated.
If you know you have an O(n^2) algorithm, and it can do x units of work in y seconds, then it can do 2x units of work in approximately 4y seconds, and 4x units of work in approximately 16y seconds.
Or, turned around: the amount of work it can do in 16*y seconds is x*sqrt(16).
So the amount of work it can do in z*y seconds is x*sqrt(z). Here, 5 minutes is 300 seconds, which is 15 times the original 20 seconds, so z = 15. The square root of 15 is approximately 3.873, meaning the wood-chuck can chuck about 60*3.873, or roughly 232.4, chucks of wood in 300 seconds.
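Just to make the arithmetic concrete, here is a minimal sketch of that calculation in Python (the function name and the square-root scaling are my own framing of the reasoning above, not anything from the hand-out):

    import math

    def estimated_work(base_work, base_seconds, new_seconds):
        # For a Theta(n^2) algorithm, time grows like n^2, so n grows like
        # sqrt(time): work scales with the square root of the time ratio.
        time_ratio = new_seconds / base_seconds
        return base_work * math.sqrt(time_ratio)

    # 60 chucks in 20 seconds; how many in 5 minutes (300 seconds)?
    print(estimated_work(60, 20, 300))  # ~232.4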
It's important to remember, though, that you're working with approximations here. Asymptotic "math" doesn't give you exact numbers. If I were doing this exercise in real life (and I have done things quite similar), I'd estimate it can do about 4 times the work in 300 seconds that it did in 20 seconds, because 300 is approximately 16 times that time, and sqrt(16) = 4.
So my answer to the question would be "approximately 240 chucks."
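For comparison, that rounded estimate falls out of the same estimated_work sketch above if you treat 300 seconds as roughly 16 times the base time:

    # Rounding 300 s up to 16 * 20 s gives sqrt(16) = 4 times the work.
    print(estimated_work(60, 20, 16 * 20))  # 240.0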