anonymous
  • anonymous
Not a question: I spent most of the day exploring questions that have popped into my head recently, some basic and some a little more subtle. One of them was extending the "2nd Derivative Test" for functions of 2 variables to the n-dimensional case. Anybody who's bored is welcome to find mistakes in my work or suggest ways to make it more elegant.
Mathematics
anonymous
  • anonymous
This question would be in better company amongst those of http://math.stackexchange.com/.
anonymous
  • anonymous
Maybe, but I don't think the work is quite advanced enough for that.
anonymous
  • anonymous
You and I haven't been visiting the same site. I see a precalculus question there every day.

anonymous
  • anonymous
In truth I rarely visit there, but since I'm bored and there are some good mathematicians here too, I figured I'd just put it up for the hell of it.
anonymous
  • anonymous
I'll denote a function of n variables, in general, by \[ f = f(\mathbf{x}) \] where \[\mathbf{x} = (x_1, x_2, \dots, x_n).\] The first-order differential of such a function is \[ df = \sum_{i=1}^n dx_i\frac{\partial f}{\partial x_i}\] and the second-order differential is \[ d^2 f = \left(\sum_{i = 1}^n dx_i \frac{\partial}{\partial x_i}\right)^2f(\mathbf{x})\] \[ = \sum_{i = 1}^n \sum_{j = 1}^n dx_idx_j \frac{\partial^2f}{\partial x_i \partial x_j} = \sum_{i = 1}^n dx_i \sum_{j = 1}^n dx_j \frac{\partial^2f}{\partial x_i \partial x_j},\] which can be rewritten as \[d^2f(\mathbf{x}) = \mathbf{x}^TH\mathbf{x} \] where H is the Hessian matrix of f, defined by \[ (H)_{ij} = \frac{\partial^2 f}{\partial x_i \partial x_j}.\]
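Here's a quick numerical sanity check of that Hessian definition (a minimal sketch assuming Python with numpy; the test function f and the helper name hessian are my own illustrations, not part of the derivation above):

    import numpy as np

    def hessian(f, x, h=1e-5):
        # Central-difference estimate of (H)_ij = d^2 f / (dx_i dx_j) at the point x.
        n = len(x)
        H = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                xpp = x.copy(); xpp[i] += h; xpp[j] += h
                xpm = x.copy(); xpm[i] += h; xpm[j] -= h
                xmp = x.copy(); xmp[i] -= h; xmp[j] += h
                xmm = x.copy(); xmm[i] -= h; xmm[j] -= h
                H[i, j] = (f(xpp) - f(xpm) - f(xmp) + f(xmm)) / (4 * h**2)
        return H

    f = lambda x: x[0]**2 + 3*x[0]*x[1] + x[1]**2   # illustrative f; exact Hessian is [[2, 3], [3, 2]]
    print(hessian(f, np.array([0.0, 0.0])))

Note that the computed matrix comes out symmetric, which is exactly the property used next.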
anonymous
  • anonymous
Oops... I should have really said \[d^2f = d\mathbf{x}^THd\mathbf{x}. \] Anyway... assuming the first-order variation df vanishes, the sign of the second-order variation is the determining factor in whether we are at a minimum, a maximum, or something else. At this point we'll switch gears and move into some matrix theory, though I'll obviously keep the same notation for clarity. If we assume that f is at least twice continuously differentiable on the region in question, we can use the equality of mixed partial derivatives to note that the Hessian matrix is symmetric, i.e. \[H = H^T\] or equivalently \[(H)_{ij} = (H)_{ji}. \] From linear algebra, we have the fact that real symmetric n×n matrices have a set of orthonormal eigenvectors (not necessarily with distinct eigenvalues) that span R^n. Therefore, we can write any arbitrary displacement dx as a sum of these eigenvectors, which I'll call u: \[ d\mathbf{x} = \sum_{i=1}^n c_i \mathbf{u}_i. \] This yields \[ d\mathbf{x}^THd\mathbf{x} = \left(\sum_{i=1}^nc_i\mathbf{u}^T_i\right)H\left(\sum_{j=1}^nc_j\mathbf{u}_j\right).\] Since \[H\mathbf{u}_j = \lambda_j\mathbf{u}_j\] and the eigenvectors are orthonormal, this reduces to \[ d\mathbf{x}^THd\mathbf{x} = \sum_{i=1}^n(c_i)^2\lambda_i\] where the lambdas are the eigenvalues.
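A small numerical check of that eigenvector expansion (again a sketch assuming numpy; the particular H and dx are arbitrary examples of mine):

    import numpy as np

    H = np.array([[2.0, 3.0], [3.0, 2.0]])   # a symmetric Hessian
    lam, U = np.linalg.eigh(H)                # orthonormal eigenvectors sit in the columns of U

    dx = np.array([0.7, -0.4])                # an arbitrary displacement
    c = U.T @ dx                              # coefficients c_i = u_i . dx
    print(dx @ H @ dx)                        # dx^T H dx
    print(np.sum(c**2 * lam))                 # sum of (c_i)^2 lambda_i -- the same number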
anonymous
  • anonymous
Are you attempting a generalized analytic continuation?
anonymous
  • anonymous
No, I'm just extending to the case of n variables rather than just 2.
anonymous
  • anonymous
Oh right. On your way then. :P
anonymous
  • anonymous
If we're looking for a local minimum, then it's clear that we need \[d^2f >0\] for any direction we choose, corresponding to any possible choice of the coefficients above, so \[c_i \text{ are completely arbitrary.} \] Suppose some eigenvalue, say the k-th, is not positive. Then if we chose \[c_i = \delta_{i,k} \] we would get \[d^2f = \lambda_k \le 0\] and our condition would not be satisfied. Therefore, for a point to be a local minimum, the eigenvalues of the Hessian matrix must all be positive. Since the determinant of any matrix is the product of its eigenvalues, obviously \[det(H) > 0. \] Similarly, if we seek a local maximum, all of the eigenvalues must be negative. In that case, \[det(H) >0 \text{ if the space has even dimension, and }det(H)<0 \text{ if the space}\] \[\text{ has odd dimension.} \] If we have a saddle point, then d^2f changes sign depending on our direction, which means that some of the eigenvalues must be positive and some must be negative.
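That sign analysis translates directly into a little classification routine; here's a minimal sketch (the function name classify and the tolerance are my own choices):

    import numpy as np

    def classify(H, tol=1e-10):
        lam = np.linalg.eigvalsh(H)                # eigenvalues of the symmetric Hessian
        if np.all(lam > tol):
            return "local minimum"                 # d^2f > 0 in every direction
        if np.all(lam < -tol):
            return "local maximum"                 # d^2f < 0 in every direction
        if np.any(np.abs(lam) <= tol):
            return "inconclusive (zero eigenvalue)"
        return "saddle point"                      # mixed signs, all nonzero

    print(classify(np.diag([2.0, 5.0, 1.0])))      # local minimum
    print(classify(np.diag([-1.0, -2.0, 3.0])))    # saddle point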
anonymous
  • anonymous
Pulling back to the 2-dimensional case: both local maxima and local minima are characterized by det(H) > 0, since the two eigenvalues share a sign. det(H) < 0 implies that the eigenvalues have opposite signs, i.e. you have a saddle point. det(H) = 0 implies that one of the eigenvalues is zero, so along that eigenvector's direction the second-order terms yield no change and the test is silent -- you might, say, be at a maximum along one direction while movement along the other yields no change to second order, as though you were walking through a Pringles tube.
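A concrete check of those three two-dimensional cases (a minimal numpy sketch; the example Hessians are mine):

    import numpy as np

    for H in (np.diag([2.0, 4.0]),     # det > 0, both eigenvalues positive: local minimum
              np.diag([2.0, -2.0]),    # det < 0, eigenvalues of opposite sign: saddle point
              np.diag([-2.0, 0.0])):   # det = 0, one zero eigenvalue: the Pringles-tube case
        print(np.linalg.det(H), np.linalg.eigvalsh(H))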
anonymous
  • anonymous
Got lazy with the TeX, huh?
anonymous
  • anonymous
As far as extending the test to higher dimensions goes, I conclude that a determinant equal to zero still implies a zero eigenvalue, so the test is inconclusive there just as before, but the other cases are more complicated because we can't deduce the signs of ALL the eigenvalues based purely on the sign of the determinant. Direct analysis of the eigenvalues of the Hessian is the only recourse.
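To make that failure of the determinant concrete, here is a quick counterexample pair (my sketch, assuming numpy): both Hessians below have determinant +1, yet one sits at a minimum and the other at a saddle.

    import numpy as np

    H_min    = np.diag([1.0, 1.0, 1.0])    # det = +1, all eigenvalues positive: local minimum
    H_saddle = np.diag([-1.0, -1.0, 1.0])  # det = +1 as well, but mixed signs: saddle point
    for H in (H_min, H_saddle):
        print(np.linalg.det(H), np.linalg.eigvalsh(H))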
anonymous
  • anonymous
And no, I just figured that since the only mathematical symbols I had to type were regular QWERTY letters, I wouldn't bother.
anonymous
  • anonymous
I will reread in more detail after dinner. Something seems weird, but sometimes I'm just dumb.
anonymous
  • anonymous
By all means. If I've made a mistake I would like to know :)
anonymous
  • anonymous
Sounds solid. I recall a similar problem set in some text. Let's see if I can find it.
anonymous
  • anonymous
Not relevant, but it's important to feed mathematicians. http://upload.wikimedia.org/wikipedia/commons/2/20/Roast_duck_fs.JPG
anonymous
  • anonymous
Have you taken PDE yet?
anonymous
  • anonymous
Like, finished it?
anonymous
  • anonymous
Only the standard, linear ones.
anonymous
  • anonymous
Well, the text is eluding me for now. Maybe I'll find it when I review ordinary differential equations.
anonymous
  • anonymous
But otherwise your modest conclusion is solid. :P
