anonymous • 5 years ago

Proof that if two vectors are linearly dependent, then one is a scalar multiple of the other.

  • This Question is Closed
  1. anonymous • 5 years ago

    Suppose the vectors u and v are linearly dependent. If either of them is the zero vector, say u = 0, then u = 0*v and we are done. So assume neither u nor v is the zero vector. By the definition of linear dependence, there exist scalars a1 and a2, not both zero, such that a1*u + a2*v = 0. If a1 = 0, then a2*v = 0, and since v is nonzero this forces a2 = 0; but a1 = a2 = 0 contradicts the choice of scalars not both zero, so a1 can't be zero. The same argument with the roles of u and v swapped shows a2 can't be zero either, so both scalars are nonzero. Solving a1*u + a2*v = 0 for u gives u = -(a2/a1)*v, so u is a scalar multiple of v.
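    For a concrete check, take u = (2, 4) and v = (1, 2). Then 1*u + (-2)*v = (2 - 2, 4 - 4) = (0, 0), so a1 = 1 and a2 = -2, and the formula gives u = -(a2/a1)*v = 2*v, which indeed holds since (2, 4) = 2*(1, 2).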
