I'm using Python and boto3 to work with S3.
I'm listing an S3 bucket and filtering by a prefix:
bucket = s3.Bucket(config.S3_BUCKET)
for s3_object in bucket.objects.filter(Prefix="0000-00-00/", Delimiter="/"):
    content = s3_object.get()["Body"].read()
Intermittently, the get() call fails with:

botocore.exceptions.ClientError: An error occurred (NoSuchKey) when
calling the GetObject operation: The specified key does not exist.
I'll assume you're using the US Standard endpoint (us-east-1); what follows applies primarily to it, not to the regional endpoints. S3 operations are atomic, but bucket listings are only eventually consistent: a key that appears in a LIST may not yet be readable with GET. The documentation gives several examples, including this one:
A process writes a new object to Amazon S3 and immediately lists keys within its bucket. Until the change is fully propagated, the object might not appear in the list.
You can get read-after-write consistency, which "fixes" this for newly written objects, by pointing your client at the s3-external-1 endpoint (note that endpoint_url needs the scheme):

s3client = boto3.client('s3', endpoint_url='https://s3-external-1.amazonaws.com')