SGDRegressor() giving me different outputs every time?

Why do I get different predictions every time I train a model with SGD on the same data, so the RMSE is different on every run? This did not happen with the ordinary least squares method!
For better understanding, here is my code:

from sklearn.linear_model import SGDRegressor

model_2 = SGDRegressor()
inputs = nonsmoker_df[['age']]
targets = nonsmoker_df.charges
model_2.fit(inputs, targets)
predictions = model_2.predict(inputs)
rmse(targets, predictions)  # rmse is a helper defined elsewhere

This happens because SGD is stochastic: SGDRegressor shuffles the training samples between epochs, so each run follows a different optimization path unless you seed the RNG. Ordinary least squares has a closed-form solution, which is why it gave you identical results every time. In scikit-learn, pass `random_state` to SGDRegressor to make runs reproducible. The same principle of seeding all RNGs applies in other frameworks too, e.g. PyTorch:
https://pytorch.org/docs/stable/notes/randomness.html
Though note that PyTorch does not guarantee fully reproducible results across platforms and releases.
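Here is a minimal sketch of the fix in scikit-learn. It uses synthetic data (your `nonsmoker_df` isn't available here, so the feature and target are made up stand-ins); the point is that two fits with the same `random_state` produce identical predictions:

```python
import numpy as np
from sklearn.linear_model import SGDRegressor

# Synthetic stand-in data: one standardized feature, linear target with noise
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
y = 3.0 * X.ravel() + rng.normal(0, 0.1, size=200)

# random_state seeds the shuffling of samples between epochs,
# so repeated fits with the same seed are deterministic.
preds_a = SGDRegressor(random_state=42).fit(X, y).predict(X)
preds_b = SGDRegressor(random_state=42).fit(X, y).predict(X)

assert np.allclose(preds_a, preds_b)  # identical runs
```

In your code that would be `model_2 = SGDRegressor(random_state=42)` (any fixed integer works). Without it, the shuffling draws from NumPy's global RNG and each run differs.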