In the following code, which generates a log-log plot, Matplotlib picks an x range that does not show the points on the scatter plot:
import matplotlib.pyplot as plt
plt.scatter([.005,.005],[1,2])
plt.xscale('log')
plt.yscale('log')
plt.show()
Additionally, if you instead use the following code, it works fine:
import matplotlib.pyplot as plt
plt.plot([.005,.005],[1,2])
plt.xscale('log')
plt.yscale('log')
plt.show()
Usually Matplotlib picks a good range for both the x and y values. Why does it not in this case?
Please note that I am aware it is possible to set the x and y ranges manually so that the default is not used. My question is specifically about why Matplotlib does not choose a good range in this case.
Thanks
It appears this is a duplicate of Why does matplotlib require setting log scale before plt.scatter() but not plt.plot()?
To anyone reading this, please let me know if I should remove this question or if you'd recommend something else. Thanks
plt.plot([.005,.005],[1,2], 'o') works fine, i.e. plt.plot with 'o' as the marker style.
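As the linked duplicate's title suggests, one workaround is to switch the axes to log scale before calling plt.scatter, so that autoscaling is computed in log space; a minimal sketch:

```python
import matplotlib.pyplot as plt

# Setting the log scales *before* plotting lets the autoscaler
# work in log space, so the default limits include the points.
plt.xscale('log')
plt.yscale('log')
plt.scatter([.005, .005], [1, 2])
plt.show()
```

With this ordering, the default x limits bracket 0.005 and the scatter points are visible without setting the ranges manually.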