Conversation

@dhermes
Contributor

@dhermes dhermes commented Apr 8, 2015

Note this was almost in #793 but dropped due to RFC3339 questions.

@craigcitro @thobrla can you vet an assumption for me? Will timestamps returned conform to

_GOOGLE_TIMESTAMP_FORMAT = '%Y-%m-%dT%H:%M:%S.%fZ'
# For example
TIME_DELETED = '2014-11-05T20:34:37.000Z'

I ask because '2014-11-05T20:34:37Z' is also totally valid according to RFC3339 and I'm wondering if we can get away with using datetime alone (and not having to use the strict-rfc3339 library for the full generality of RFC3339).
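For what it's worth, both variants can be handled with plain datetime and a strptime fallback. This is just a sketch of the idea under discussion (parse_timestamp is a hypothetical helper, not the protorpc implementation):

```python
from datetime import datetime

_GOOGLE_TIMESTAMP_FORMAT = '%Y-%m-%dT%H:%M:%S.%fZ'
_NO_FRACTION_FORMAT = '%Y-%m-%dT%H:%M:%SZ'


def parse_timestamp(value):
    """Parse an RFC3339 UTC timestamp, with or without fractional seconds."""
    try:
        return datetime.strptime(value, _GOOGLE_TIMESTAMP_FORMAT)
    except ValueError:
        # Fall back to the variant with no milliseconds, e.g. '...:37Z'.
        return datetime.strptime(value, _NO_FRACTION_FORMAT)


parse_timestamp('2014-11-05T20:34:37.000Z')  # with milliseconds
parse_timestamp('2014-11-05T20:34:37Z')      # also valid RFC3339
```

Both calls above produce the same naive datetime, which is why the single-format assumption matters only if we want to avoid the fallback.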

@googlebot googlebot added the cla: yes This human has signed the Contributor License Agreement. label Apr 8, 2015
@coveralls

Coverage Status

Changes Unknown when pulling 9f7d6fc on dhermes:use-native-datetime into GoogleCloudPlatform:master.

@thobrla

thobrla commented Apr 8, 2015

I think this is true, but it is an implementation detail so it could break in the future; the discovery document only claims this is in RFC3339 format.

That being said, apitools just uses protorpc, which uses datetime exclusively. @craigcitro may be able to comment further. Looking at the code, we treat the timestamps as DateTimeFields (https://github.com/google/protorpc/blob/master/protorpc/message_types.py#L52), but those are in millisecond format, not RFC3339, and I can't find where the string conversion happens.

@dhermes
Contributor Author

dhermes commented Apr 8, 2015

@thobrla The conversion happens in apitools

@thobrla

thobrla commented Apr 8, 2015

Ah; the responsible code is actually in protorpc, and that handles both kinds of string formats (using text search and strptime). That seems like a reasonable approach.

@dhermes
Contributor Author

dhermes commented Apr 8, 2015

Yes, I'm cool doing that (only because we will end up requiring protorpc via apitools anyhow).

@craigcitro
Contributor

One more heads-up about timestamps: some newer APIs (e.g. Pub/Sub) may return timestamps with even more precision, down to the nanosecond. (see source here)

@dhermes
Contributor Author

dhermes commented Apr 9, 2015

@craigcitro WAT? datetime doesn't support nanoseconds, so I suppose I should hold off. Oh Google, le sigh.
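For context on the objection above: datetime tops out at microsecond precision, so any nanosecond digits would have to be truncated. A hedged sketch of such a workaround (parse_with_nanos is a hypothetical helper, not part of protorpc or this PR):

```python
from datetime import datetime


def parse_with_nanos(value):
    """Parse an RFC3339 UTC timestamp, truncating beyond-microsecond digits.

    datetime only stores microseconds, so nanosecond precision is lost.
    This is an illustrative workaround, not what any library actually does.
    """
    base, _, fraction = value.rstrip('Z').partition('.')
    # Keep at most 6 fractional digits (microseconds); pad short fractions.
    micros = int(fraction[:6].ljust(6, '0')) if fraction else 0
    dt = datetime.strptime(base, '%Y-%m-%dT%H:%M:%S')
    return dt.replace(microsecond=micros)


parse_with_nanos('2015-04-09T16:54:00.123456789Z')
# microsecond == 123456; the trailing '789' nanoseconds are dropped
```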

@craigcitro
Contributor

well, your fix is still good for storage. we can cross the next bridge when we come to it.

@dhermes
Contributor Author

dhermes commented Apr 9, 2015

@craigcitro You mean this PR is good or the suggestion to use protorpc.util.decode_datetime?

As it were, I can confirm that the timezone support and the option of not having milliseconds (the only extra features) are not necessary.

From the discovery type documentation:

| Type value | Format value | Meaning |
| --- | --- | --- |
| string | date-time | An RFC3339 timestamp in UTC time. This is formatted as YYYY-MM-DDThh:mm:ss.fffZ. The milliseconds portion (".fff") is optional. Defined in the JSON Schema spec. |

@dhermes dhermes force-pushed the use-native-datetime branch from 9f7d6fc to 2645e1d Compare April 9, 2015 16:54
@dhermes
Contributor Author

dhermes commented Apr 9, 2015

@tseaver I made an update to the unit tests and also made a slight tweak to the use of mtime in Blob.download_to_filename (was some repeated code).

PTAL

@coveralls

Coverage Status

Changes Unknown when pulling 2645e1d on dhermes:use-native-datetime into GoogleCloudPlatform:master.

@tseaver
Contributor

tseaver commented Apr 9, 2015

LGTM

dhermes added a commit that referenced this pull request Apr 9, 2015
Returning native datetime objects for Bucket/Blob time properties.
@dhermes dhermes merged commit 3f6ee7b into googleapis:master Apr 9, 2015
@dhermes dhermes deleted the use-native-datetime branch April 9, 2015 17:56
parthea pushed a commit that referenced this pull request Nov 24, 2025
…age (#807)

* chore: upgrade enchant packages in (Owlbot-controlled) docs Docker image

Ubuntu Jammy doesn't have the enchant and libenchant1c2a packages.
So, after googleapis/synthtool#1422 was merged,
which switched the base image of the Dockerfile, it's been unable to
install these packages.

* 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
Co-authored-by: Mariatta Wijaya <[email protected]>
parthea pushed a commit that referenced this pull request Nov 26, 2025
* fix: do not use the GAE APIs on gen2+ runtimes

Currently, this library uses the App Engine API in all environments if
it can be imported successfully. This assumption made sense when the API
was only available on gen1, but this is no longer the case.
See https://github.com/GoogleCloudPlatform/appengine-python-standard

In order to comply with AIP-4115, we must treat GAE gen2+ as a "compute
engine equivalent environment" even if the GAE APIs are importable.
In other words, google.auth.default() must never return an
app_engine.Credentials on GAE gen2+.

* blacken

Co-authored-by: arithmetic1728 <[email protected]>
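As a rough illustration of the gen1-vs-gen2+ distinction that the commit message above describes, runtime generation could be inferred from the environment. This is only a sketch assuming the GAE_RUNTIME environment variable convention; the actual google-auth detection logic may differ:

```python
import os

# Hypothetical set of first-generation App Engine standard runtimes.
_GEN1_RUNTIMES = frozenset(['python27', 'go111', 'java8', 'php55'])


def is_gae_gen1():
    """Heuristic: treat only first-generation runtimes as GAE gen1.

    On gen2+ runtimes this returns False, so the caller would fall
    through to Compute Engine-style credentials even if the legacy
    GAE APIs happen to be importable.
    """
    runtime = os.environ.get('GAE_RUNTIME', '')
    return runtime in _GEN1_RUNTIMES
```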