I've been playing around with a free account on appfog.com and this question came to mind.
What's the difference between scaling memory and scaling instances?
For example, what is the real-world performance and reliability difference between these three configurations?
1 instance with 512MB.
2 instances with 256MB each.
4 instances with 128MB each.
If there is no difference, why wouldn't you keep upgrading a single instance to the maximum amount of memory before scaling out to a second instance? That would seem like the simplest approach from a management perspective.
Thanks in advance for your answers!