If \(d=\gcd(a,b)\), then show that \(\gcd(a^2,b^2)=d^2\).

Mathematics

Given \(\gcd(a, b) = d\). By Bézout's identity there are integers \(s\) and \(t\) such that \(as + bt = d\). Dividing both sides by \(d\) gives \(\frac{a}{d}s + \frac{b}{d}t = 1\), i.e. \(s\left(\frac{a}{d}\right) + t\left(\frac{b}{d}\right) = 1\), and therefore \(\gcd\!\left(\frac{a}{d}, \frac{b}{d}\right) = 1\).
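For concreteness, a numeric illustration of this step (the numbers are chosen here for illustration and are not part of the answer): with \(a = 12\), \(b = 18\), \(d = 6\),
\[
12\cdot(-1) + 18\cdot 1 = 6 \;\Longrightarrow\; 2\cdot(-1) + 3\cdot 1 = 1,
\]
so \(\gcd(2, 3) = \gcd(a/d, b/d) = 1\), as claimed.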
@Rohangrr, that was not my question; you proved something different from what I asked.


Other answers:

No, it's your answer, @2bornot2b.
You need to show that \(\gcd(a^2, b^2) = d^2\), and what have you shown? You have shown \(\gcd(a/d,\, b/d) = 1\).
Let \(a\) and \(b\) be integers, not both zero. Assume first that \(\gcd(a, b) = 1\) and set \(g = \gcd(a^2, b^2)\); I need to prove that \(g = 1\). By Theorem 2.1.3 there exist integers \(x\) and \(y\) such that \(ax + by = 1\). Now do some algebra:
\[
1 = 1^3 = (ax + by)^3 = a^3x^3 + 3a^2x^2by + 3axb^2y^2 + b^3y^3 = a^2(ax^3 + 3x^2by) + b^2(3axy^2 + by^3).
\]
Set \(u = ax^3 + 3x^2by\) and \(v = 3axy^2 + by^3\). Thus \(a^2u + b^2v = 1\). Since \(g = \gcd(a^2, b^2)\), there exist \(s, t \in \mathbb{Z}\) such that \(a^2 = gs\) and \(b^2 = gt\). Hence \(1 = a^2u + b^2v = gsu + gtv = g(su + tv)\), that is, \(g \mid 1\). Since \(g > 0\), we conclude \(g = 1\).

Now assume that \(d = \gcd(a, b) > 1\). Then there exist \(j, k \in \mathbb{Z}\) such that \(a = dj\) and \(b = dk\). By Proposition 2.2.5 it follows that \(\gcd(j, k) = 1\), and by the first part of this proof \(\gcd(j^2, k^2) = 1\). Since \(a^2 = d^2j^2\) and \(b^2 = d^2k^2\), and since \(j^2\) and \(k^2\) are relatively prime, Lemma 1 implies that \(\gcd(a^2, b^2) = \gcd(d^2j^2, d^2k^2) = d^2\).
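As a quick numerical sanity check on the statement (the numbers below are illustrative, not part of the quoted proof): take \(a = 20\) and \(b = 12\). Then
\[
\gcd(20, 12) = 4, \qquad \gcd(20^2, 12^2) = \gcd(400, 144) = 16 = 4^2.
\]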
@JuanitaM: Plagiarism is not a good idea.
@2bornot2b: HINT: Prime factorization.
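For anyone who wants to follow this hint, here is a rough sketch of the prime-factorization route (my wording, not the attached solution): write \(a = \prod_i p_i^{\alpha_i}\) and \(b = \prod_i p_i^{\beta_i}\) over a common list of primes, allowing zero exponents. Then
\[
d = \gcd(a, b) = \prod_i p_i^{\min(\alpha_i, \beta_i)}, \qquad
\gcd(a^2, b^2) = \prod_i p_i^{\min(2\alpha_i, 2\beta_i)} = \prod_i p_i^{2\min(\alpha_i, \beta_i)} = d^2.
\]
(This assumes \(a, b > 0\); signs do not affect the gcd.)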
1 Attachment
@nikvist, you say that \(\gcd(p, q) = 1\); why do you say that? I mean, did you assume it? Can you make such an assumption?
@2bornot2b: That's not an assumption. It is an (easy) inference.
Please explain @FoolForMath
Let \(d\) be the gcd of \(a, b\); then there exist integers \(x\) and \(y\) such that \(ax + by = d \implies \frac{a}{d}x + \frac{b}{d}y = 1\). So \(\frac{a}{d}\) and \(\frac{b}{d}\) are relatively prime. Is this the reasoning behind that step of nikvist's?
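If that is indeed the step used in the attachment, the rest of the argument can be finished roughly like this (a sketch combining it with the first part of the proof quoted above): since \(\gcd\!\left(\frac{a}{d}, \frac{b}{d}\right) = 1\), cubing a Bézout relation as above gives \(\gcd\!\left(\frac{a^2}{d^2}, \frac{b^2}{d^2}\right) = 1\), and multiplying both entries by \(d^2\) scales the gcd by \(d^2\):
\[
\gcd(a^2, b^2) = d^2 \gcd\!\left(\tfrac{a^2}{d^2}, \tfrac{b^2}{d^2}\right) = d^2.
\]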
