I've been having some problems with a time-based application I'm developing, which uses Java on the Android client side and Python on the server side. I've run into a discrepancy of -3 hours (which is exactly the time zone offset of the Android device) between the times reported by Java and Python, even though on both sides I'm getting the time in a way that (according to SO and other sites) should return a UTC time. On the client, I'm using
Date utcNow = new Date()
and on the server, I use
datetime.datetime.now()
Yet, when run at the same time, they yield 2013-09-09 11:52:16 and 2013-09-09 14:52:16, respectively. I suspect the issue is on the Java side, since running date on the server returns the same value as Python (as expected), with the time zone shown as UTC, while the same command on the Android device returns a BRT (GMT-3) time.
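As a sanity check (just something to run on both machines, not part of the app), comparing epoch seconds should give the same number regardless of each machine's time zone, assuming the server's date command accepts +%s. A minimal sketch on the Java side:

    // Epoch seconds don't depend on any time zone, so this should print
    // the same number as `date +%s` on the server (clock skew aside).
    System.out.println(System.currentTimeMillis() / 1000L);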
While searching for time zone adjustment in both languages, every answer claimed that the methods above return a UTC date, yet the three-hour difference is clearly there.
How can I convert the Android device's time to UTC?
Right now I'm using a really ugly (and extremely deprecated) workaround: utcNow.setHours(utcNow.getHours() + 3)
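For the record, the cleaner approach I've seen suggested (only a sketch, and the timestamp pattern below is just my assumption of the format being sent) is to leave the Date alone, since it already stores a time-zone-independent instant, and make the formatting step use UTC explicitly:

    import java.text.SimpleDateFormat;
    import java.util.Date;
    import java.util.Locale;
    import java.util.TimeZone;

    public class UtcTimestamp {
        public static void main(String[] args) {
            Date now = new Date(); // just an instant; no zone stored in it

            // Format that instant explicitly in UTC instead of the device's
            // default zone (BRT here), so the string matches the server's.
            SimpleDateFormat fmt =
                    new SimpleDateFormat("yyyy-MM-dd HH:mm:ss", Locale.US);
            fmt.setTimeZone(TimeZone.getTimeZone("UTC"));

            System.out.println(fmt.format(now)); // e.g. 2013-09-09 14:52:16
        }
    }

That way the offset is handled by the formatter's zone rather than by hand-adjusted hours, so it keeps working when the device isn't at GMT-3 or when DST changes the offset.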
EDIT: Side note: From Jon's answer, I noticed Java's Date objects are equivalent to Python's naive datetime objects.
Comments from other users on the question:

Date isn't a String... if you call Date.toString() that will show you the value in the system local time zone, but that isn't information which is in the Date value itself.

This is more a conceptual problem than a programming language problem. Imagine the server is configured with UTC+0 and I use a client from Peru (where I live) with UTC-5 and send a request to the server with a Date created using my time zone.

Date doesn't have a time zone. It's just an instant in time.

Are you using SimpleDateFormat? If not, that's the problem.
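A tiny test I put together to illustrate what those comments are saying (the epoch Date is just a convenient fixed instant to print):

    import java.util.Date;

    public class DateHasNoZone {
        public static void main(String[] args) {
            Date d = new Date(0L);            // the Unix epoch instant
            System.out.println(d.getTime());  // 0 on every machine, in any zone
            System.out.println(d);            // toString() renders it in the
                                              // default zone, e.g. BRT on my
                                              // phone, UTC on the server
        }
    }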