I don’t understand this whole GHz-hr thing. Can you explain it to me?
Sure. Imagine you have a computer with a single CPU that has only one core. Wind back the clock and assume that core runs at exactly 1 GHz. If we rendered your scene file on that machine and it took one hour to complete, the consumed “compute time” is called 1 GHz-hr.
Obviously, our computers have far more than one core, and they all run at clock speeds well above 1 GHz, but the notion of a GHz-hr still applies. Multiply the number of cores by the clock speed of those cores by the render time on that CPU to get the GHz-hr consumed, then multiply by the power rate you select when you submit the job ($/GHz-hr). That product is the total cost of your job.
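To make the arithmetic concrete, here is a small sketch of that calculation. The node specs and the $/GHz-hr rate below are made-up illustration values, not our actual hardware or pricing:

```python
def ghz_hours(cores: int, clock_ghz: float, render_hours: float) -> float:
    """GHz-hr consumed: core count x clock speed x wall-clock render time."""
    return cores * clock_ghz * render_hours

def job_cost(cores: int, clock_ghz: float, render_hours: float,
             rate_per_ghz_hr: float) -> float:
    """Total job cost: GHz-hr consumed x the rate selected at submission."""
    return ghz_hours(cores, clock_ghz, render_hours) * rate_per_ghz_hr

# Example: a 32-core node at 3.0 GHz rendering for 2 hours.
usage = ghz_hours(32, 3.0, 2.0)            # 192.0 GHz-hr
cost = job_cost(32, 3.0, 2.0, 0.004)       # at a hypothetical $0.004/GHz-hr
print(f"{usage} GHz-hr, ${cost:.3f}")      # prints "192.0 GHz-hr, $0.768"
```

Note that a slower node simply takes longer to render the same frame, so the GHz-hr consumed (and the cost) stays roughly the same, which is what makes this billing method fair across mixed hardware.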
The GHz-hr method of charging is the one most widely used by render farms around the world, as it is the most accurate for asymmetric farms whose nodes vary in core count or clock speed. It’s also a fair way of directly comparing us to our competitors, assuming all other factors are equal.
We’ve added a render cost estimator to our Pricing page to help you with all the math.