The task is to write a program that determines the coordinates of two points in the plane.
There is a chain of three segments (the so-called "cat's leg"): the first and third segments each share one endpoint with the second. Schematically (ignoring the angles) it can be drawn like this: (A) ------ (B) ------ (C) ------ (D) (the segments are AB, BC and CD respectively). The program receives the lengths of these segments and the coordinates of points A and D. It is also known that the angle ABC (the one formed by segments AB and BC) relates to the angle BCD with a given ratio c, i.e. angle ABC = c * angle BCD (this parameter is also passed to the program). Find points B and C.
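Since the whole problem hinges on the angle condition, here is a small sketch (in Python; the language is my assumption, since the post does not name one) of how the interior angle at a vertex can be computed from coordinates. The helper name `angle_at` is hypothetical, not from the original post:

```python
import math

def angle_at(q, p, r):
    """Interior angle at vertex q of the polyline p-q-r, in radians.

    Computed via the dot product of the vectors q->p and q->r;
    the cosine is clamped to [-1, 1] to guard against rounding."""
    ux, uy = p[0] - q[0], p[1] - q[1]
    vx, vy = r[0] - q[0], r[1] - q[1]
    cos_a = (ux * vx + uy * vy) / (math.hypot(ux, uy) * math.hypot(vx, vy))
    return math.acos(max(-1.0, min(1.0, cos_a)))
```

For example, with A = (0, 0), B = (1, 2), C = (3, 2), the call `angle_at(B, A, C)` gives the angle ABC, here acos(-1/sqrt(5)), about 116.57 degrees.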
Please help me write this program.
Addendum.
The system of equations is as follows:
( x(A) - x(B) )^2 + ( y(A) - y(B) )^2 = |AB|^2 ;
( x(D) - x(C) )^2 + ( y(D) - y(C) )^2 = |CD|^2 ;
( x(B) - x(C) )^2 + ( y(B) - y(C) )^2 = |BC|^2 ;
β = c * γ .
Here x(T) is the x coordinate of point T and y(T) its y coordinate; |OT| is the distance between points O and T (the length of segment OT); β is the angle ABC, γ is the angle BCD, and c is the given constant.
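The system above has four unknowns (the coordinates of B and C) and four equations. One way to attack it, sketched below under my own assumptions (the function name, the parametrization, and the use of Newton's method are all mine, not from the post): since |AB| and |CD| are known, B can be written as A + |AB|*(cos t, sin t) and C as D + |CD|*(cos p, sin p), which reduces the system to two unknowns (t, p) and two residual equations, the |BC| constraint and β - c·γ = 0. A 2x2 Newton iteration with a finite-difference Jacobian then solves it; like any Newton solver, it needs a reasonable initial guess and will find only the solution nearest that guess (the configuration is generally not unique, e.g. it can be mirrored):

```python
import math

def solve_cats_leg(A, D, lAB, lBC, lCD, c, t0, p0, tol=1e-10, max_iter=100):
    """Find B and C given endpoints A, D, segment lengths and the
    angle ratio c (angle ABC = c * angle BCD).

    B = A + lAB*(cos t, sin t), C = D + lCD*(cos p, sin p);
    Newton's method on the two residuals in the unknowns (t, p)."""

    def angle(pp, q, r):
        # interior angle at q formed by segments q-pp and q-r
        ux, uy = pp[0] - q[0], pp[1] - q[1]
        vx, vy = r[0] - q[0], r[1] - q[1]
        d = (ux * vx + uy * vy) / (math.hypot(ux, uy) * math.hypot(vx, vy))
        return math.acos(max(-1.0, min(1.0, d)))

    def points(t, p):
        B = (A[0] + lAB * math.cos(t), A[1] + lAB * math.sin(t))
        C = (D[0] + lCD * math.cos(p), D[1] + lCD * math.sin(p))
        return B, C

    def residuals(t, p):
        B, C = points(t, p)
        f1 = math.hypot(B[0] - C[0], B[1] - C[1]) - lBC   # |BC| constraint
        f2 = angle(A, B, C) - c * angle(B, C, D)          # beta = c * gamma
        return f1, f2

    t, p = t0, p0
    h = 1e-7  # finite-difference step for the Jacobian
    for _ in range(max_iter):
        f1, f2 = residuals(t, p)
        if abs(f1) < tol and abs(f2) < tol:
            break
        g1, g2 = residuals(t + h, p)
        k1, k2 = residuals(t, p + h)
        a, b = (g1 - f1) / h, (k1 - f1) / h   # Jacobian row 1
        e, d = (g2 - f2) / h, (k2 - f2) / h   # Jacobian row 2
        det = a * d - b * e
        t -= (d * f1 - b * f2) / det          # Newton step via Cramer's rule
        p -= (a * f2 - e * f1) / det
    return points(t, p)
```

A quick check on a symmetric trapezoid A = (0, 0), B = (1, 2), C = (3, 2), D = (4, 0), where |AB| = |CD| = sqrt(5), |BC| = 2 and the two angles are equal (c = 1): starting from the rough guesses t0 = 1.1, p0 = 2.0, the solver recovers B ≈ (1, 2) and C ≈ (3, 2).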