I am doing the readings for lecture 7-8 (the pdf one) and I don't understand theorem 1 and harmonic numbers under the block stacking section. The main thing I don't understand is where their formulas are coming from but an overall explanation would be nice too. Here is a link to the notes http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-00-introduction-to-computer-science-and-programming-fall-2008/readings/l11_sums2.pdf
MIT 6.00 Intro Computer Science (OCW)
Fact 1 is just a center-of-mass equation: it gives the center of mass of two objects treated as if they were one combined object.
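Here's a minimal sketch of what I mean, assuming Fact 1 is the usual mass-weighted average (the function name is mine, not from the notes):

```python
# Fact 1, as I read it: the center of mass of two objects, treated as one,
# is the mass-weighted average of their individual centers of mass.
def combined_center_of_mass(m1, x1, m2, x2):
    """Center of mass of mass m1 at position x1 and mass m2 at x2, combined."""
    return (m1 * x1 + m2 * x2) / (m1 + m2)

# Two equal blocks, one centered at 0 and one at 1: the pair balances at 0.5.
print(combined_center_of_mass(1, 0.0, 1, 1.0))  # 0.5
```

Notice the formula reduces to the plain midpoint when the masses are equal, which is the sanity check I'd run first.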
Sorry, but without a specific question this is going to be long-winded - I'll try anyway. Thanks for the question, by the way; this is all pretty interesting stuff and I had already forgotten it (that's my downfall).
Do you understand the concept of proving something by mathematical induction? I had to read up on that a bit - pretty sure I started at Wikipedia.
In classes or lectures or textbooks I guess I've never wondered where the equations in a theorem come from - the authors know more than I do - and the object is to understand the proof. A theorem that is presented to you is always someone else's idea; the question is, can they prove it? Or, if you have an idea that seems to solve or describe a problem, can you prove it?
So somebody came up with the equation in Theorem 1 because they had an idea, and then they set out to prove it by induction. The result of their proof is a center of mass that they plug into Fact 1 to get a more generalized solution to the problem.
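You can actually check the induction step numerically. A sketch, assuming (as I recall from the notes) that block k, counting from the top, sticks out 1/(2k) past the block below it; the point of the proof is that the center of mass of the top k blocks then lands exactly on the edge of block k+1, so every sub-stack balances:

```python
from fractions import Fraction  # exact arithmetic, so the balance check is exact

def right_edges(n):
    """Right-edge position of each unit-length block (index 1 = top block),
    with the table edge at 0. Block k sticks out 1/(2k) past the block below."""
    edges = [Fraction(0)] * (n + 2)          # edges[n + 1] is the table edge
    for k in range(n, 0, -1):
        edges[k] = edges[k + 1] + Fraction(1, 2 * k)
    return edges

def cm_top(edges, k):
    """Center of mass of the top k blocks (Fact 1 applied repeatedly):
    each unit block's center sits half a length left of its right edge."""
    return sum(e - Fraction(1, 2) for e in edges[1:k + 1]) / k

n = 5
edges = right_edges(n)
for k in range(1, n + 1):
    # The induction step: the top k blocks balance exactly on block k+1's edge.
    assert cm_top(edges, k) == edges[k + 1]
print(edges[1])  # total overhang of the top block past the table edge
```

The total overhang for n = 5 comes out to 137/120, which is (1 + 1/2 + 1/3 + 1/4 + 1/5) / 2 - and that's where the harmonic number below comes from.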
It turns out that the solution to the block-stacking problem is a harmonic number. There is a good Wikipedia article on harmonic numbers.
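Concretely, the nth harmonic number is just the sum of reciprocals, and (if I remember the classic result right) the maximum overhang with n blocks works out to H_n / 2 block lengths:

```python
def harmonic(n):
    """H_n = 1 + 1/2 + 1/3 + ... + 1/n."""
    return sum(1.0 / k for k in range(1, n + 1))

# Max overhang with n unit blocks is (I believe) H_n / 2 block lengths.
for n in (1, 4, 31):
    print(n, harmonic(n), harmonic(n) / 2)
```

A fun consequence: H_4 / 2 is just over 1, so four blocks are already enough to make the top block overhang the table by more than its own full length.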
Notice that if you graph the terms 1/k of a harmonic number, they approach zero but never get there. The partial sums H_n, on the other hand, never settle down to a final value - they keep growing without bound, just slower and slower, roughly like the natural log of n.
The next section goes on to show that if you can express the solution to a problem as a harmonic number, then you can make sound estimates of its size. Using the integral method (bounding the sum by areas under the curve 1/x), you can find an upper and a lower bound: roughly ln(n) on the low side and 1 + ln(n) on the high side. The gap between the bounds stays fixed at 1, but H_n itself keeps growing, so relative to the value you're estimating, the uncertainty gets smaller and smaller as n increases. You can figure out roughly what H_n is (within a known tolerance) without actually adding up all the terms. It's an estimation tool that can give fairly precise estimates.
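You can see the bracketing directly. A quick sketch, assuming the bounds are ln(n) ≤ H_n ≤ 1 + ln(n) as described above:

```python
import math

def harmonic(n):
    """H_n = 1 + 1/2 + ... + 1/n, computed the slow way."""
    return sum(1.0 / k for k in range(1, n + 1))

# Integral bounds: ln(n) <= H_n <= 1 + ln(n).
# The bracket is always 1 wide, but H_n grows, so the *relative* error shrinks.
for n in (10, 100, 1000):
    lo, h, hi = math.log(n), harmonic(n), 1 + math.log(n)
    print(n, lo, h, hi)
```

For n = 1000 the true sum is about 7.49, bracketed between about 6.91 and 7.91 - a pretty tight estimate for never having summed a thousand terms.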
Finally, all of that was background for the concept of asymptotic notation - so that it isn't just a word.
This was an excellent reading: each step introduced concepts and laid a foundation for the next, so that we/you/I would have a better understanding of what exactly is being said when an algorithm is described as having a certain level of complexity using asymptotic notation (those six funny symbols) - I love it when they do that.
I might not have explained anything - just reiterated what was already there - sorry. It might be better to ask some specific questions. I guess sometimes when I do that it's because I'm trying to understand it myself.
It seems I always end up reading a wikipedia article first and then pick some of the external links at the bottom of the article. Predictably - there seems to be a wealth of in-depth computer science articles on wikipedia. In my wanderings I also ran across the Dictionary of Algorithms and Data Structures at the NIST and that kinda blew me away for some reason - http://xlinux.nist.gov/dads/