Not a question: I spent most of the day exploring questions that popped up in my head recently, some basic and some a little more subtle. One of them was the extension of the "2nd Derivative Test" for functions of 2 variables to the n-dimensional case. Anybody who's bored is welcome to find mistakes in my work or make suggestions for more elegance.
 one year ago

This Question is Closed

badreferences:
This question would be in better company amongst those of http://math.stackexchange.com/.
 one year ago

Jemurray3:
Maybe, but I don't think it's quite advanced enough work for that.
 one year ago

badreferences:
You and I haven't been visiting the same site. I see a precalculus question there every day.
 one year ago

Jemurray3:
In truth I rarely visit there, but since I'm bored and there are some good mathematicians here too, I figured I'd just put it up for the hell of it.
 one year ago

Jemurray3:
I'll denote a function of n variables, in general, by \[ f = f(\mathbf{x}) \] where \[\mathbf{x} = (x_1, x_2, \ldots, x_n)\] The first-order differential of such a function is \[ df = \sum_{i=1}^n dx_i\frac{\partial f}{\partial x_i}\] and the second-order differential is \[ d^2 f = \left(\sum_{i = 1}^n dx_i \frac{\partial}{\partial x_i}\right)^2f(\mathbf{x}) = \sum_{i = 1}^n \sum_{j = 1}^n dx_i\,dx_j \frac{\partial^2f}{\partial x_i \partial x_j}\] which can be rewritten as \[d^2f(\mathbf{x}) = \mathbf{x}^TH\mathbf{x} \] where H is the Hessian matrix of f, defined by \[ (H)_{ij} = \frac{\partial^2 f}{\partial x_i \partial x_j}\]
 one year ago
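The construction above can be checked numerically. Here's a minimal sketch (assuming numpy; the quadratic function and evaluation point are hypothetical examples, not from the thread) that approximates the Hessian entries by central finite differences and then evaluates the quadratic form for a small displacement:

```python
import numpy as np

# Hypothetical test function of n = 2 variables: f(x) = x0^2 + 3*x0*x1 + x1^2,
# whose exact Hessian is the constant matrix [[2, 3], [3, 2]].
def f(x):
    return x[0]**2 + 3*x[0]*x[1] + x[1]**2

def hessian_fd(f, x, h=1e-5):
    """Approximate (H)_ij = d^2 f / (dx_i dx_j) by central finite differences."""
    n = len(x)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n)
            ej = np.zeros(n)
            ei[i] = h
            ej[j] = h
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4 * h * h)
    return H

x0 = np.array([1.0, 2.0])
H = hessian_fd(f, x0)
dx = np.array([1e-3, -2e-3])
d2f = dx @ H @ dx          # the quadratic form dx^T H dx from the derivation
```

The Hessian of a quadratic is constant, so the finite-difference estimate here is exact up to floating-point rounding.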

Jemurray3:
Oops... I should really have said \[d^2f = d\mathbf{x}^THd\mathbf{x} \] Anyway... assuming the first-order variation df vanishes, the sign of the second-order variation is the determining factor in whether we are at a minimum, a maximum, or something else. At this point we'll switch gears and move into some matrix theory, though I'll keep the same notation for clarity. If we assume that f is at least twice continuously differentiable on the region in question, the equality of mixed partial derivatives tells us that the Hessian matrix is symmetric, i.e. \[H = H^T\] or equivalently \[(H)_{ij} = (H)_{ji} \] From linear algebra, every n-dimensional symmetric matrix has a set of orthonormal eigenvectors (not necessarily with distinct eigenvalues) that span R^n. Therefore we can write any arbitrary vector dx as a sum of these eigenvectors, which I'll call u: \[ d\mathbf{x} = \sum_{i=1}^n c_i \mathbf{u}_i \] This yields \[ d\mathbf{x}^THd\mathbf{x} = \left(\sum_{i=1}^nc_i\mathbf{u}^T_i\right)H\left(\sum_{j=1}^nc_j\mathbf{u}_j\right)\] Since \(H\mathbf{u}_j = \lambda_j\mathbf{u}_j\) and the eigenvectors are orthonormal, this reduces to \[ d\mathbf{x}^THd\mathbf{x} = \sum_{i=1}^n c_i^2\lambda_i\] where the lambdas are the eigenvalues.
 one year ago
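The diagonalization identity at the end of that post can be verified directly. A minimal sketch (assuming numpy; the symmetric matrix and displacement are hypothetical): `eigh` returns orthonormal eigenvectors as the columns of U, and the quadratic form computed directly matches the eigenvalue-weighted sum of squared coefficients.

```python
import numpy as np

# Hypothetical symmetric "Hessian" for illustration.
H = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
lam, U = np.linalg.eigh(H)          # eigenvalues lam, orthonormal columns u_i

# Expand a displacement dx in the eigenbasis: dx = sum_i c_i u_i.
dx = np.array([0.3, -0.1, 0.7])
c = U.T @ dx                        # coefficients c_i

quad_direct = dx @ H @ dx           # dx^T H dx computed directly
quad_eigen = np.sum(c**2 * lam)     # sum_i c_i^2 lambda_i, as in the derivation
```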

badreferences:
Are you attempting a generalized analytic continuation?
 one year ago

Jemurray3:
No, I'm just extending to the case of n variables rather than just 2.
 one year ago

badreferences:
Oh right. On your way then. :P
 one year ago

Jemurray3:
If we're looking for a local minimum, then it's clear that we need \[d^2f >0\] for any direction we choose, corresponding to any possible choice of the coefficients above, so the \(c_i\) are completely arbitrary. Suppose some eigenvalue \(\lambda_k\) is not positive. Then choosing \[c_i = \delta_{i,k} \] would violate the condition. Therefore, for a point to be a local minimum, the eigenvalues of the Hessian matrix must all be positive. Since the determinant of any matrix is the product of its eigenvalues, it follows that \[\det(H) > 0 \] Similarly, if we seek a local maximum, all of the eigenvalues must be negative. In that case, \[\det(H) >0 \text{ if the space has even dimension, and }\det(H)<0 \text{ if it has odd dimension} \] If we have a saddle point, then d^2f changes sign depending on our direction, which means that some of the eigenvalues must be positive and some must be negative.
 one year ago
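The eigenvalue-sign test can be sketched as a small classifier (assuming numpy; `classify_critical_point` and the example Hessians are hypothetical names chosen for illustration): all eigenvalues positive gives a local minimum, all negative a local maximum, mixed signs a saddle, and any zero eigenvalue leaves the test inconclusive.

```python
import numpy as np

def classify_critical_point(H, tol=1e-12):
    """Second-derivative test at a critical point via Hessian eigenvalue signs."""
    lam = np.linalg.eigvalsh(H)        # H assumed symmetric
    if np.any(np.abs(lam) < tol):
        return "inconclusive"          # a zero eigenvalue defeats the test
    if np.all(lam > 0):
        return "local minimum"
    if np.all(lam < 0):
        return "local maximum"
    return "saddle point"

# Hypothetical diagonal Hessians, chosen only to exercise each case:
H_min = np.diag([1.0, 2.0, 3.0])       # all positive
H_max = np.diag([-1.0, -2.0, -3.0])    # all negative
H_sad = np.diag([1.0, -2.0, 3.0])      # mixed signs
```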

Jemurray3:
Pulling back to the 2-dimensional case, both local maxima and local minima are characterized by det(H) > 0, with the sign of the diagonal entries distinguishing the two. det(H) < 0 implies that the eigenvalues have opposite signs, i.e. you have a saddle point. det(H) = 0 implies that one of the eigenvalues is zero, which means that you are at a maximum or minimum in one direction while, to second order, movement along the other direction yields no change, as though you were walking through a Pringles tube; the test is inconclusive there.
 one year ago
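In two variables this determinant shortcut is easy to state in code. A minimal sketch (assuming numpy; `det_test_2d` is a hypothetical helper, not from the thread): det(H) is the product of the two eigenvalues, so its sign alone separates extrema from saddles, and when it's positive both eigenvalues share the sign of H[0, 0].

```python
import numpy as np

def det_test_2d(H):
    """2-variable second-derivative test using only det(H) and H[0,0]."""
    d = np.linalg.det(H)
    if d < 0:
        return "saddle point"          # eigenvalues of opposite sign
    if d > 0:                          # both eigenvalues share H[0,0]'s sign
        return "local minimum" if H[0, 0] > 0 else "local maximum"
    return "inconclusive"              # a zero eigenvalue

H_bowl   = np.array([[2.0, 1.0], [1.0, 2.0]])   # det = 3 > 0, H[0,0] > 0
H_saddle = np.array([[1.0, 2.0], [2.0, 1.0]])   # det = -3 < 0
```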

badreferences:
Got lazy with the TeX, huh?
 one year ago

Jemurray3:
As far as extending the test to higher dimensions goes, I conclude that a determinant equal to zero again means that at least one eigenvalue is zero, so the test is inconclusive, but the other cases are more complicated because we can't deduce the signs of ALL the eigenvalues purely from the sign of the determinant. Analysis of the eigenvalues of the Hessian themselves is the only recourse.
 one year ago
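A concrete counterexample makes this point (assuming numpy; the diagonal Hessian is hypothetical): in 3 variables, two negative eigenvalues and one positive one multiply to a positive determinant, yet the point is a saddle, so det(H) > 0 alone proves nothing about the eigenvalue signs.

```python
import numpy as np

# Hypothetical 3x3 Hessian: eigenvalues -1, -1, 2, so det(H) = 2 > 0,
# but the mixed signs mean the critical point is a saddle, not a minimum.
H = np.diag([-1.0, -1.0, 2.0])
lam = np.linalg.eigvalsh(H)

det_positive = bool(np.linalg.det(H) > 0)
is_saddle = bool(np.any(lam > 0) and np.any(lam < 0))
```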

Jemurray3:
And no, I just figured that since the only mathematical symbols I had to type were regular QWERTY letters I wouldn't bother.
 one year ago

badreferences:
I will reread in more detail after dinner. Something seems weird, but sometimes I'm just dumb.
 one year ago

Jemurray3:
By all means. If I've made a mistake I would like to know :)
 one year ago

badreferences:
Sounds solid. I recall a similar problem set in some text. Let's see if I can find it.
 one year ago

badreferences:
Not relevant, but it's important to feed mathematicians. http://upload.wikimedia.org/wikipedia/commons/2/20/Roast_duck_fs.JPG
 one year ago

badreferences:
Have you taken PDE yet?
 one year ago

badreferences:
Like, finished it?
 one year ago

Jemurray3:
Only the standard, linear ones.
 one year ago

badreferences:
Well, the text is eluding me for now. Maybe I'll find it when I review regular differential equations.
 one year ago

badreferences:
But otherwise your modest conclusion is solid. :P
 one year ago