Let’s say that your project is a high-resolution still image (larger than your screen size). To estimate its cost beforehand (approximately, of course), you must first have an idea of the time required to render the full-resolution image on your own computer.
For some scenes, the render time increases linearly with the number of pixels. If you render a 1024 x 768 image in 15 minutes, then the render time for a 2048 x 1536 version of the same scene will be around 1 hour (4 x more pixels → 4 x more time).
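Under that linear assumption, the arithmetic can be sketched as a one-line function (the function name and units are illustrative, not part of any RANCH tooling):

```python
def linear_estimate(base_minutes, base_w, base_h, target_w, target_h):
    """Estimate render time assuming it is proportional to pixel count."""
    return base_minutes * (target_w * target_h) / (base_w * base_h)

# The example from the text: 15 minutes at 1024 x 768, scaled to 2048 x 1536.
print(linear_estimate(15, 1024, 768, 2048, 1536))  # → 60.0 minutes
```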
Unfortunately, this simple reasoning is not guaranteed to hold if your scene includes procedural elements (terrains, metaclouds, some complex materials). In that case, the render time can grow faster than the pixel count. This means that your 2048 x 1536 image could take not 1 hour as expected, but 2 hours!
To determine how your scene behaves, we recommend that you render it a few times at various low resolutions, for instance 160 x 120, then 320 x 240, then 640 x 480, and compare the increase in render time with the increase in resolution.
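One way to turn those test renders into a prediction is to fit a scaling exponent k such that time ∝ pixels^k (k = 1 means linear behavior; k > 1 means procedural elements are making the time grow faster than the resolution). This is a minimal sketch of that idea, not an official RANCH procedure; the function names are hypothetical:

```python
import math

def scaling_exponent(px_a, time_a, px_b, time_b):
    """Fit k in time ∝ pixels**k from two test renders at different sizes."""
    return math.log(time_b / time_a) / math.log(px_b / px_a)

def estimate_time(px_test, time_test, px_full, k):
    """Extrapolate a test render's time to the full resolution using exponent k."""
    return time_test * (px_full / px_test) ** k

# Example: a 160 x 120 test took 1 minute and a 320 x 240 test took 4 minutes.
k = scaling_exponent(160 * 120, 1.0, 320 * 240, 4.0)
full = estimate_time(160 * 120, 1.0, 1024 * 768, k)
print(k, full)  # k = 1.0 here, so the extrapolation is purely linear
```

A third test render (e.g. 640 x 480) lets you check whether k stays stable; if it keeps climbing, trust the higher value when budgeting.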
Depending on the distribution of complex elements in the image, the RANCH will typically be 50 to 100 times faster than your computer at rendering a still image.