New option for Scale node.

This addresses a problem reported by venomgfx on IRC.
If you have a 2k render with a render size of 25% (the
problem occurs with any resolution/size) and you want to use
a 1k image in the compositor, the first thing you do is add a Scale node.

Here comes the problem: if you set the "Scene Size" option on the
Scale node, the output buffer is not the same size as the render.

This is because "Scene Size" works with the image size, not
the render resolution, so in this case the result is 25% of 1k,
not 25% of 2k.

The new "Render Size" option scales the output buffer to the
render resolution, taking the render size (percentage) into account as well.
Diego Borghetti
2010-07-08 20:58:34 +00:00
parent a9050083fe
commit 24f63b2081
3 changed files with 6 additions and 0 deletions


@@ -383,6 +383,7 @@ void ntreeGPUMaterialNodes(struct bNodeTree *ntree, struct GPUMaterial *mat);
#define CMP_SCALE_RELATIVE 0
#define CMP_SCALE_ABSOLUTE 1
#define CMP_SCALE_SCENEPERCENT 2
#define CMP_SCALE_RENDERPERCENT 3
/* the type definitions array */


@@ -1176,6 +1176,7 @@ static void def_cmp_scale(StructRNA *srna)
{0, "RELATIVE", 0, "Relative", ""},
{1, "ABSOLUTE", 0, "Absolute", ""},
{2, "SCENE_SIZE", 0, "Scene Size", ""},
{3, "RENDER_SIZE", 0, "Render Size", ""},
{0, NULL, 0, NULL, NULL}};
prop = RNA_def_property(srna, "space", PROP_ENUM, PROP_NONE);


@@ -64,6 +64,10 @@ static void node_composit_exec_scale(void *data, bNode *node, bNodeStack **in, b
	else if(node->custom1==CMP_SCALE_SCENEPERCENT) {
		newx = cbuf->x * (rd->size / 100.0f);
		newy = cbuf->y * (rd->size / 100.0f);
	}
	else if (node->custom1==CMP_SCALE_RENDERPERCENT) {
		newx= (rd->xsch * rd->size)/100;
		newy= (rd->ysch * rd->size)/100;
	} else { /* CMP_SCALE_ABSOLUTE */
		newx= MAX2((int)in[1]->vec[0], 1);
		newy= MAX2((int)in[2]->vec[0], 1);
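
For the case reported above, here is a minimal standalone sketch of the two computations (a hedged illustration, assuming "2k" means a 2048 px wide render and "1k" a 1024 px wide image; the variable names are hypothetical, only the formulas mirror the patch):

#include <stdio.h>

int main(void)
{
	const int render_x = 2048;  /* rd->xsch: full render width (assumed 2k = 2048 px) */
	const int image_x  = 1024;  /* cbuf->x: input image width (assumed 1k = 1024 px) */
	const float size   = 25.0f; /* rd->size: render size percentage */

	/* "Scene Size" (CMP_SCALE_SCENEPERCENT): percentage applied to the image buffer */
	int scene_size_x = (int)(image_x * (size / 100.0f));  /* 256 */

	/* "Render Size" (CMP_SCALE_RENDERPERCENT): percentage applied to the render resolution */
	int render_size_x = (render_x * (int)size) / 100;     /* 512 */

	printf("Scene Size:  %d px\n", scene_size_x);
	printf("Render Size: %d px\n", render_size_x);
	return 0;
}

With "Scene Size" the scaled 1k image comes out 256 px wide, while the 25% render is 512 px wide; "Render Size" produces a buffer that matches the render, which is exactly the mismatch this commit fixes.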