Earlier, when images had to be added to staging/production, they were committed to the git repository and our deploy script took care of uploading them to the CDN (S3 in our case).
But as we improved our design, the number and size of images kept growing. It was then decided that we would stop committing retina images to the repo and upload them directly to the CDN instead.
This is a good enough workflow on paper, but it has its fair share of issues, especially when certain fixes/changes need to be deployed urgently and the images are forgotten.
How do web services handle image uploads for their websites?
The reason we are not uploading to the repo is that it bloats the repository considerably. Is that an issue we should overlook for ease of workflow?
Well, binary files indeed aren't supposed to be part of your repository. As an alternative, you should probably be deploying code from the repo to your hosting platform, and static files (images/documents/...) to another destination, such as a CDN.
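To make the split concrete, here is a minimal sketch of what the asset half of such a deploy step could look like. The bucket name, the local asset directory, and the choice of the AWS CLI are all assumptions for illustration, not part of the original setup.

```python
# Sketch of the "code to host, assets to CDN" split as a deploy step.
# ASSETS_DIR and CDN_BUCKET are placeholders; adjust to your project.
import subprocess

ASSETS_DIR = "public/images"                # placeholder local asset directory
CDN_BUCKET = "s3://example-assets-bucket"   # placeholder S3 bucket

def sync_command(src, dest):
    """Build the `aws s3 sync` invocation that mirrors src into the bucket."""
    return ["aws", "s3", "sync", src, dest, "--acl", "public-read"]

# In the deploy script, after the code push:
#   subprocess.run(sync_command(ASSETS_DIR, CDN_BUCKET), check=True)
```

Because `aws s3 sync` only transfers files that changed, re-running the deploy stays cheap even as the image set grows.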
Another alternative for the web might be to include some of the images as base64-encoded strings in your CSS, but this makes for very heavy CSS and shouldn't be done for "content" images, rather for "UI" ones like icons and so on.
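For reference, generating such an inline value is straightforward; a small sketch using only the standard library (the file name and MIME type below are placeholders):

```python
# Sketch: produce a CSS url() value that embeds a small image as a base64
# data URI, so no separate HTTP request (or repo/CDN upload) is needed.
import base64

def css_data_uri(path, mime="image/png"):
    """Return a CSS url() value embedding the file as a base64 data URI."""
    with open(path, "rb") as f:
        payload = base64.b64encode(f.read()).decode("ascii")
    return f"url('data:{mime};base64,{payload}')"

# Typical use, generated once at build time and pasted into the stylesheet:
#   .icon-search { background-image: url('data:image/png;base64,iVBOR...'); }
```

Note that base64 inflates the payload by roughly a third, which is why this only pays off for small UI images.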
You can use a service that provides an image management solution including uploads, storage, administration, manipulation, and delivery.
In one of my projects I had to upload images using PHP. I could have done it the barebones way and written a PHP upload script, but instead I decided to manage my file uploads with a service like Cloudinary.
Cloudinary provides an API for uploading images (and any other kind of file) to the cloud, and it takes away the pain of writing large amounts of code to interact with that API by providing an open-source PHP library.
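To give a feel for what the API call involves underneath the SDK, here is a hedged sketch of building such an upload request with only the standard library. The cloud name and upload preset are placeholders, and in practice you would use the official SDK rather than hand-rolling this; the endpoint shape follows Cloudinary's documented upload URL.

```python
# Sketch: constructing (not sending) an upload request to Cloudinary's HTTP API.
# CLOUD_NAME and UPLOAD_PRESET are placeholders for illustration only.
import base64
import urllib.parse
import urllib.request

CLOUD_NAME = "my-cloud"        # placeholder cloud name
UPLOAD_PRESET = "my-preset"    # placeholder unsigned upload preset

def upload_url(cloud_name):
    """Cloudinary's upload endpoint for a given cloud."""
    return f"https://api.cloudinary.com/v1_1/{cloud_name}/image/upload"

def build_upload_request(path):
    """Build a POST request that sends the file as a base64 data URI."""
    with open(path, "rb") as f:
        data_uri = "data:image/png;base64," + base64.b64encode(f.read()).decode("ascii")
    body = urllib.parse.urlencode({
        "file": data_uri,
        "upload_preset": UPLOAD_PRESET,
    }).encode("ascii")
    return urllib.request.Request(upload_url(CLOUD_NAME), data=body)

# Actually sending it is one line (needs a real cloud name and preset):
#   response = urllib.request.urlopen(build_upload_request("hero.png"))
```

The SDK hides all of this (plus signing, retries, and response parsing), which is exactly the "large amounts of code" it saves you from writing.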