
Compute timestamps based on a 13-digit Unix timestamp (in ms) in Python

I want to compute the timestamps that are 300 milliseconds before and after the given 13-digit Unix time (in milliseconds) of the system. I checked the question here on converting a 13-digit Unix time to a timestamp.

Then, given the timestamp, I wrote a small piece of Python code to double-check whether the outputs are correct.

Assume the given system timestamp is a 13-digit Unix time equal to X = "1396226255964". Now I need to compute a new timestamp "Y1" that is 300 milliseconds before X and another, "Y2", that is 300 milliseconds after X.

Does this code compute Y1 & Y2 correctly?

X = "1396226255964"
Y1 = int(X) - int (300000)
print("Y1:", Y1)
Y2 = int(X) + int (300000)
print("Y2:", Y2)

Outputs: Y1: 1396225955964 Y2: 1396226556620

Answer

Not quite. The integer arithmetic itself is fine, but the offset you are using does not match your stated goal of 300 milliseconds, and the Y2 output you listed does not match what the code actually prints. Let's walk through your approach step by step.

1. Understanding the 13-digit Unix timestamp (in milliseconds):

A 13-digit Unix timestamp (like 1396226255964) represents the number of milliseconds since the Unix epoch (January 1, 1970). So, when you subtract or add milliseconds to this value, you're correctly manipulating the timestamp in terms of milliseconds.
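As a quick sanity check (a minimal sketch, not part of your code), you can convert the millisecond value to a human-readable UTC datetime by dividing by 1000:

from datetime import datetime, timezone

X = "1396226255964"  # 13-digit Unix timestamp in milliseconds

# Divide by 1000 to get seconds, then build an aware UTC datetime.
dt = datetime.fromtimestamp(int(X) / 1000, tz=timezone.utc)
print(dt)  # 2014-03-31 00:37:35.964000+00:00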

2. The goal:

  • Y1: Timestamp 300 milliseconds before X.
  • Y2: Timestamp 300 milliseconds after X.

3. The computation logic:

  • Since the timestamp is already expressed in milliseconds, shifting it by 300 milliseconds means subtracting or adding 300 directly.
  • Your code subtracts and adds 300,000 instead, which shifts the timestamp by 300,000 milliseconds, i.e. 300 seconds (5 minutes), not 300 milliseconds (a cross-check is sketched right after this list).
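Here is one possible cross-check (a sketch using only the standard library; the EPOCH constant is introduced just for this example). It builds the datetime from an exact integer number of milliseconds, shifts it with a timedelta, and converts back, so no floating-point rounding is involved:

from datetime import datetime, timedelta, timezone

X = 1396226255964  # milliseconds since the epoch
EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

# Build the datetime from integer milliseconds, shift by 300 ms, convert back.
dt = EPOCH + timedelta(milliseconds=X)
y2_dt = dt + timedelta(milliseconds=300)
Y2 = (y2_dt - EPOCH) // timedelta(milliseconds=1)

print(Y2)       # 1396226256264
print(X + 300)  # 1396226256264 -- plain integer arithmetic gives the same result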

4. The code analysis:

X = "1396226255964"  # 13-digit Unix timestamp in milliseconds

# Intended: Y1 is 300 milliseconds before X
Y1 = int(X) - int(300000)  # actually subtracts 300,000 ms (300 s); the offset should be 300

# Intended: Y2 is 300 milliseconds after X
Y2 = int(X) + int(300000)  # actually adds 300,000 ms (300 s); see the output check below

Checking the output:

  1. Y1 (what the code computes):

    • Y1 = 1396226255964 - 300000 = 1396225955964, which is 300 seconds (not 300 milliseconds) before X.
  2. Y2 (what the code computes):

    • Y2 = 1396226255964 + 300000 = 1396226555964, which is 300 seconds after X. The output you listed (1396226556620) does not match this sum, so it cannot have come from the code shown.
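You can confirm both points with plain integer arithmetic:

X = 1396226255964

print(X + 300000)  # 1396226555964 -- what the posted code actually computes for Y2
print(X + 300)     # 1396226256264 -- 300 milliseconds after X, which is what you asked for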

5. Final corrected output:

X = "1396226255964"
Y1 = int(X) - 300000
Y2 = int(X) + 300000

print("Y1:", Y1)  # Y1: 1396225955964
print("Y2:", Y2)  # Y2: 1396226555964

Conclusion:

Your approach of doing integer arithmetic directly on the millisecond value is right, but the offset should be 300, not 300000: since the timestamp is in milliseconds, 300000 shifts it by 300 seconds (5 minutes). With an offset of 300, Y1 = 1396226255664 and Y2 = 1396226256264. Separately, the Y2 you listed as output (1396226556620) does not match what your posted code prints (1396226555964), so that value did not come from the code shown.
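If you do this in several places, a small helper keeps the unit explicit (the function name shift_ms is just an illustrative choice, not from your code):

def shift_ms(ts_ms: int, delta_ms: int) -> int:
    """Return a 13-digit millisecond timestamp shifted by delta_ms milliseconds."""
    return ts_ms + delta_ms

X = int("1396226255964")
print("Y1:", shift_ms(X, -300))  # Y1: 1396226255664
print("Y2:", shift_ms(X, +300))  # Y2: 1396226256264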