I am using S3 on the Free Tier, which includes 20,000 GET requests a month.
Without a caching layer in front, each visitor can rack up 10-20 requests. I doubt I'll receive enough traffic on my own to reach the 20,000.
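To put rough numbers on that (the 10-20 requests per visitor is just my own estimate), here is a quick back-of-the-envelope sketch:

```python
# Rough estimate: how many visitors would exhaust the 20,000 free GETs?
FREE_TIER_GETS = 20_000            # S3 Free Tier GET requests per month
REQUESTS_PER_VISITOR = (10, 20)    # my guess for an uncached page load

for per_visitor in REQUESTS_PER_VISITOR:
    visitors = FREE_TIER_GETS // per_visitor
    print(f"At {per_visitor} GETs per visit, ~{visitors} visitors/month hit the cap")

# At 10 GETs per visit, ~2000 visitors/month hit the cap
# At 20 GETs per visit, ~1000 visitors/month hit the cap
```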
But what if my site were accessed by some loop run intentionally, with the aim of putting a strain on it? The requests could easily exceed 20,000 and I would be left with a huge Amazon bill through no fault of my own.
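To be clear about what I mean, the "loop" I'm worried about could be as trivial as this (the bucket URL is made up, purely to illustrate the scenario):

```python
# Hypothetical abuse loop: every iteration counts as a billable GET against the bucket.
# The URL below is a placeholder, not a real site.
import urllib.request

TARGET = "http://example-bucket.s3-website-us-east-1.amazonaws.com/index.html"

for _ in range(100_000):
    urllib.request.urlopen(TARGET).read()
```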
Is there a way out?