In general the matrices that we use to transform with are not square; that is, the number of rows and the number of columns differ. We multiply an n x 1 vector ( a vector is just a matrix with a single column ! ) on the right by an m x n matrix on the left, giving an m x 1 vector. For example :
\[\left[\begin{matrix}1& 2 \\ 3 & 4 \\ 5 & 6\end{matrix}\right]\left[\begin{matrix}7\\ 8 \end{matrix}\right]=\left[\begin{matrix}23\\53\\83\end{matrix}\right]\]that is \[Ax = b\]This implies b and the columns of A are in the same space of dimension m ( here 3 ), but x is of dimension n ( here 2 ).
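If you want to check this numerically, here is a quick sketch using NumPy ( assumed available ) with the matrix and vector from the example above :

```python
import numpy as np

# The 3x2 matrix A and the 2x1 vector x from the example.
A = np.array([[1, 2],
              [3, 4],
              [5, 6]])
x = np.array([7, 8])

# A (3x2) times x (2x1) gives a vector of dimension 3.
b = A @ x
print(b)  # [23 53 83]
```

Note that `x @ A` would fail with a shape error, which is the point: the dimensions have to line up as m x n times n x 1.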
So you are right : in general x and b come from spaces of different dimensions. We can think of the components of a vector ( x ) from one space as the coefficients in a linear combination of vectors ( the columns of A ) from another space. Hence the comment that b is a linear combination of the columns of A, or that b is in the column space of A, etc. The whole Ax = b business ( as matters are usually presented ) is finding suitable x's, if any, for a given A and b.
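That "linear combination of the columns" view can be made concrete with a small NumPy sketch ( again using the example matrix; NumPy assumed available ) :

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4],
              [5, 6]])
x = np.array([7, 8])

# Ax is exactly: x[0] times the first column of A, plus
# x[1] times the second column of A.
combo = x[0] * A[:, 0] + x[1] * A[:, 1]

print(combo)                           # [23 53 83]
print(np.array_equal(combo, A @ x))    # True
```

So the components of x ( dimension 2 ) act as weights on the columns of A, and the result lives in the same space as b ( dimension 3 ).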
Now if A is square then I guess it is a matter of language - or, more probably, of the underlying problem or application being described - whether x, b and the columns of A are 'in' the same space. Many physical problems have A producing b as a physical transform ( movement, rotation, reflection ) of x, for instance, and so they may all reasonably be lumped together. But you might be doing accounting, where the vectors are prices and quantities, ie. not 'the same' in any natural sense.
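As a sketch of the square case ( a hypothetical example of my own, not from the question ) : a 2 x 2 rotation matrix, where x and b = Ax naturally live in the same plane :

```python
import numpy as np

# Rotate by 90 degrees counterclockwise; A is square (2x2),
# so x and b = Ax are both vectors in the plane.
theta = np.pi / 2
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
x = np.array([1.0, 0.0])

b = A @ x
print(np.round(b, 6))  # [0. 1.] -- the point (1, 0) rotated to (0, 1)
```

Here it is entirely natural to say x and b are 'in' the same space; in the prices-and-quantities case the arithmetic is identical but that identification makes no physical sense.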