I have three examples of what I think are roughly the same thing:
In an SQL database, using either:
…can prevent a database thread from losing an update made by a separate database thread.
Likewise, using the
synchronized keyword in Java can prevent a global session or static variable from losing an update, by ensuring that only one Java thread at a time can access the variable to change its value.
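To make the Java case concrete, here's a minimal sketch of what I mean (the `Counter` class and its method names are just illustrative, not from any real code):

```java
// A shared static variable guarded by synchronized methods.
public class Counter {
    private static int value = 0;

    // Only one thread at a time may execute a synchronized
    // static method on this class, so two concurrent calls
    // to increment() cannot interleave and lose an update.
    public static synchronized void increment() {
        value++;
    }

    public static synchronized int get() {
        return value;
    }
}
```

Without the synchronized keyword, two threads could both read the old value, both add one, and both write back, so one increment would be lost.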
And finally, a network-shared physical resource such as a printer can receive multiple requests to print a job, but can really only print one job at a time. When more than one job is requested, the print queue adds each new job to the list of existing jobs waiting to be processed.
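The printer behavior I'm picturing is something like this sketch (the `PrintSpooler` class is my own invention for illustration; a real spooler would use a blocking worker thread rather than a simple poll):

```java
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;

// Many clients submit jobs concurrently, but a single
// worker removes and "prints" them one at a time, in
// first-in-first-out order.
public class PrintSpooler {
    private final Queue<String> jobs = new ConcurrentLinkedQueue<>();

    // Safe to call from any number of client threads:
    // the job is simply appended to the queue.
    public void submit(String job) {
        jobs.add(job);
    }

    // Called by the single printer worker: removes and
    // returns the oldest waiting job, or null if idle.
    public String printNext() {
        return jobs.poll();
    }
}
```

The point being that requests aren't rejected or blocked at the caller; they just wait their turn in the queue while exactly one job at a time holds the physical resource.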
Now my question is: are all of these examples of a similar pattern in computer programming, or are they different? And what is/are the pattern(s) called?
I've seen the SQL example referred to as "avoiding the lost update", while the Java
synchronized example seems more along the lines of "blocking", "locking", or a "semaphore".
The printer example seems like something else again, since the other incoming jobs are placed into a queue to wait for their opportunity to use the resource.