I just started watching Lecture 8. Hopefully I'll be able to help soon. :)
I think there's some math behind Big O calculations that I haven't studied. It seems similar to calculus, calculating a rate of change, i.e. differentiating, but the specifics are different enough that it's not the same process.
It's been a very long time since I've done Calculus...I suppose I will have to relearn it all, including algebra...
Differentiating is when you take a function and find a new function that returns the slope of the first. So if you have a function of x, f(x) = 15*x + 30, differentiating gets you g(x) = 15, since it's a straight line. This lets you find how quickly the original function grows at any given point. A straight line is like an O(n) function: it grows directly as n grows. For f(x) = 5*x^2 + 3*x + 4, differentiating gets you g(x) = 10*x + 3, so the rate of growth itself grows as x grows. That function ends up being O(n^2). Like I said, it's similar but not the same. It's all about understanding rates of change.
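A rough sketch of the same idea in code (my own illustration, not from the lecture): counting the "steps" a linear algorithm takes versus a quadratic one as n grows. The function names are made up for the example.

```python
def linear_steps(n):
    # a single loop over n items -> O(n) steps
    steps = 0
    for _ in range(n):
        steps += 1
    return steps

def quadratic_steps(n):
    # a nested loop, n * n iterations -> O(n^2) steps
    steps = 0
    for _ in range(n):
        for _ in range(n):
            steps += 1
    return steps

for n in (10, 20, 40):
    print(n, linear_steps(n), quadratic_steps(n))
```

Doubling n doubles the linear count but quadruples the quadratic count, which is exactly the "rate of growth itself grows" behavior described above.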
I remember some of calc, but not enough to find the derivative of anything more complicated than f(x) = x^2. (Would it be 2x?) I'll have to study a bit more there... thanks Keen.
The derivative of f(x) = 2x is g(x) = 2, and the derivative of f(x) = x^2 is g(x) = 2x. The general rule: if f(x) = x^n, then g(x) = n*x^(n-1). For some simple derivative examples/rules, look up the power rule and the chain rule... two pretty strong concepts that cover the math necessary.
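If you want to convince yourself the power rule is right without redoing the algebra, you can check it numerically with a finite difference. This is just a sketch; `numerical_derivative` is my own helper name, not anything from the course.

```python
def numerical_derivative(f, x, h=1e-6):
    # central-difference approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

f = lambda x: x ** 2   # f(x) = x^2
g = lambda x: 2 * x    # the power rule says f'(x) = 2x

for x in (1.0, 3.0, 5.0):
    approx = numerical_derivative(f, x)
    assert abs(approx - g(x)) < 1e-4  # numeric estimate matches 2x
```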
Has anyone here worked on Quiz One? If so, I'm looking for some guidance on problem 4.
Regarding Quiz One problem 4, try this (it prints the first n odd perfect squares):

def first_N(n):
    count = 1
    num = 1
    while count <= n:
        if num * num % 2 != 0:
            print(num * num)
            count += 1
        num += 1
Big O notation in plain English: http://www.cforcoding.com/2009/07/plain-english-explanation-of-big-o.html I found it a few days ago and it was an immense help in understanding Big O notation.
For computing and complexity, the easiest way to think about m(x) = O(n(x)) (where m and n are functions) is this: there exist constants A and K such that for all x from A to infinity, m(x) < K * n(x). So if n(x) = x, it means the complexity for a large input size is bounded above by K*x, i.e. linear. It's an asymptotic way to see how the growth behaves. (This course shows the time complexity in the worst case, as the size of the input grows.) Big O by itself doesn't show the space complexity or the average-case complexity, which are also great tools but are a bit more difficult. MIT OpenCourseWare is the best!
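The definition above can be made concrete by exhibiting actual witnesses A and K. Here's a small sketch (my own example, not from the course): m(x) = 3x + 5 is O(x), witnessed by K = 4 and A = 5, since 3x + 5 <= 4x exactly when x >= 5.

```python
def m(x):
    return 3 * x + 5   # the function we're bounding

def n(x):
    return x           # the bounding function

K, A = 4, 5  # witnesses: m(x) <= K * n(x) for all x >= A
assert all(m(x) <= K * n(x) for x in range(A, 10_000))
```

Note the bound genuinely fails below A (e.g. m(4) = 17 > 16 = 4*n(4)), which is why the definition only requires it "from A to infinity."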
The algorithm you select in a piece of code shows how you've decided to search for a value. The cost of the algorithm depends on the variables involved, and different methods have specific behaviors. The main classes of algorithms are: linear, quadratic, logarithmic, and exponential. Identifying these classes within code is the "big picture" of Lecture 8. The mathematical names describe how the code behaves. The growth behaviors are illustrated in the first 4 code problems on the related resources handout: the first two show what linear and recursive algorithms look like in code, and the quadratic and exponential ones are located at 18:40 in the video.
1. The linear algorithm behaves by searching stepwise, one element at a time. The name of its order of complexity is "linear."
2. The quadratic algorithm arises from nesting one loop inside another. In the video, the order of complexity is n*m; however, if n = m, then you have n^2, and this is quadratic (19:30 - 20:15).
3. A logarithmic algorithm cuts the search space by a constant fraction, such as 1/2, 1/3, etc., at each step. This tends to be efficient because it cuts down the search time by 1/2, 1/3, etc. each time.
4. Exponential algorithms show outrageously rapid growth that is generally to be avoided. The same problem can be solved quickly with one algorithm but be extremely expensive with a different one; the efficient choice depends on the situation.
Analyzing complexity is a matter of selecting the algorithm best matched to a given problem. The Towers of Hanoi illustrated the different efficiencies of different methods. Detailed algebra/calculus review for free at http://brightstorm.com/. Hope this helps someone.
Thank you, gravityenergy. Your explanation paints a good conceptual picture.