chip9munk
Hello everybody!
One strange problem, please help!
I have the following 2D array: users_elements_matrix
numpy.shape(users_elements_matrix) is (100,43)
and array merged_binary_ratings
numpy.shape(merged_binary_ratings) is (100,)
Now, when I run:
numpy.linalg.lstsq(users_elements_matrix, merged_binary_ratings)
I get ridiculous numbers for the coefficients — they are all identical, about 1.38946385e+15.
What is really strange is that if I run it on all but the last column,
numpy.linalg.lstsq(users_elements_matrix[:,0:42], merged_binary_ratings)
I get reasonable numbers.
I tested several things and examined the matrix; everything looks OK with the data.
How is it possible that one additional column (one more variable in the linear regression)
has such a strange impact?!
I am losing my mind here, please help!
Thanks!
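For reference, here is a minimal self-contained sketch of the setup described above, using hypothetical random stand-in data with the same shapes (the real `users_elements_matrix` is not shown in the post). It also prints the matrix rank and condition number that `lstsq` reports — if the rank comes back below 43 or the condition number is huge, the columns are (nearly) linearly dependent, which can blow coefficients up to values like 1e+15:

```python
import numpy as np

# Hypothetical stand-in data with the shapes from the question:
# 100 samples, 43 predictor columns. The real data is not shown in the post.
rng = np.random.default_rng(0)
users_elements_matrix = rng.random((100, 43))
merged_binary_ratings = rng.integers(0, 2, size=100).astype(float)

# Full least-squares fit. rcond=None uses the new default cutoff
# and silences the FutureWarning on newer NumPy versions.
coeffs, residuals, rank, sing_vals = np.linalg.lstsq(
    users_elements_matrix, merged_binary_ratings, rcond=None
)

# Diagnostics: for well-behaved data, rank should equal 43 and the
# condition number (ratio of largest to smallest singular value)
# should be modest. A rank below 43 or an enormous condition number
# means the columns are (nearly) linearly dependent.
print("rank:", rank)
print("condition number:", sing_vals[0] / sing_vals[-1])
```

Comparing these diagnostics between the full matrix and `users_elements_matrix[:,0:42]` would show whether the 43rd column is (close to) a linear combination of the others.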