We currently have one Jenkins server that automates deployments for about 15 different Java web applications. Each application has three deployment environments on separate Linux boxes.
In Jenkins, every application has its own "view". Inside every view there is a job for every branch. Those jobs simply build a war file and place it on the Jenkins server. Then, to deploy, a DevOps person SSHes into the deployment environment (whether it's dev, test, or prod) for a particular application and runs an SCP script that copies the war file from the Jenkins server to the deployment environment (usually into a Tomcat webapps folder for deployment).
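For context, the manual step looks roughly like this (hostnames, paths, and the naming convention below are placeholders, not our real setup; the function echoes the scp command rather than running it):

```shell
#!/bin/sh
# Sketch of the manual SCP step, parameterized by app, branch, and
# environment. All paths and hostnames are made-up examples.
deploy_cmd() {
  app="$1"; branch="$2"; env="$3"

  # Only the three known environments are allowed.
  case "$env" in
    dev|test|prod) ;;
    *) echo "unknown environment: $env" >&2; return 1 ;;
  esac

  # Hypothetical location of the last successful build's artifact.
  war="/var/lib/jenkins/jobs/${app}-${branch}/lastSuccessful/${app}.war"

  # Echo the command instead of executing it, for illustration.
  echo "scp $war tomcat@${app}-${env}.example.com:/opt/tomcat/webapps/"
}
```

The point is that the app/branch/environment triple fully determines the copy, which is what makes it feel automatable.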
Obviously this is a very silly process. I want a Jenkins deployment process that actually deploys the application, so no one needs to SSH into a box and run an SCP shell script. I have this working for some of our smaller applications, but I am running into two issues.
On issue #1 - How do I parameterize deployments by branch and environment without a combinatorial explosion of jobs? An obvious brute-force approach would be to create a job for each deployment environment, but then I don't know how to make the "test deployment environment" job take the branch I wish to deploy as a parameter, and I don't want to create three jobs (one per environment) for every new branch. Alternatively, I could create a job for every branch, but then I don't know how to pass the deployment environment in as a parameter. The only way I've found so far is to reconfigure the jobs before each build, which I'd like to avoid.
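What I'm imagining is a single parameterized deploy job per application, where both the branch and the target environment come in as build parameters. A rough sketch as a declarative Jenkinsfile (job names, hostnames, and paths are made up; the artifact copy assumes the Copy Artifact plugin is installed):

```groovy
// Hypothetical deploy pipeline: one job per application, with the
// branch and environment supplied as build parameters.
pipeline {
    agent any
    parameters {
        string(name: 'BRANCH', defaultValue: 'master',
               description: 'Branch whose war file should be deployed')
        choice(name: 'DEPLOY_ENV', choices: ['dev', 'test', 'prod'],
               description: 'Target deployment environment')
    }
    stages {
        stage('Fetch artifact') {
            steps {
                // copyArtifacts is provided by the Copy Artifact plugin;
                // "myapp-${params.BRANCH}" is a placeholder job name.
                copyArtifacts projectName: "myapp-${params.BRANCH}",
                              selector: lastSuccessful()
            }
        }
        stage('Deploy') {
            steps {
                // Placeholder host/path; in practice this would be an
                // ssh/scp step or a deploy-to-Tomcat plugin.
                sh "scp myapp.war tomcat@myapp-${params.DEPLOY_ENV}.example.com:/opt/tomcat/webapps/"
            }
        }
    }
}
```

That way one job covers every branch/environment pair, and a new branch requires no new jobs. Is something like this the right direction?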
On issue #2 - Is there any plugin, or generally accepted strategy, for protecting against nefarious production deployments? I realize this may depend heavily on the answer to issue #1. Is having a separate account used only for production deployments an effective safeguard?
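One idea I've considered (all names hypothetical) is gating the prod case behind Jenkins' `input` approval step, which can restrict who is allowed to approve, alongside something like the Role-based Authorization Strategy plugin for job-level permissions:

```groovy
// Hypothetical guard stage: prod deployments pause for a manual
// approval that only a named group may submit.
stage('Approve prod deploy') {
    when { expression { params.DEPLOY_ENV == 'prod' } }
    steps {
        // 'submitter' limits who may click Proceed;
        // 'release-managers' is a placeholder group name.
        input message: "Deploy ${params.BRANCH} to production?",
              submitter: 'release-managers'
    }
}
```

Would that be considered sufficient protection, or is a separate production account (or separate Jenkins instance) still the norm?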