
I'm iterating over the material slots of all the meshes in an imported scene, and I'm wondering if there's a way to downsample the image data associated with each material using Python, or whether I should just do it with the standard Python fare (the numpy family). I.e., I'm assuming Blender has some built-in bilinear/bicubic downsampling functionality, but I'm new to the API.

...
for i, obj in enumerate(meshes):
    for s in obj.material_slots:
        if s.material and s.material.use_nodes:
            for n in s.material.node_tree.nodes:
                if n.type == 'TEX_IMAGE':
                    img_size = n.image.size[:]  # (width, height)
                    print(obj.name, 'uses', n.image.name, 'saved at', n.image.filepath, img_size)
  • There's image.scale(new_width, new_height). Commented Jul 22, 2022 at 4:33
  • Seems like a step in the right direction! @scurest, does that allow you to specify the filtering, or what does it do? The docs are somewhat unclear. Commented Jul 22, 2022 at 5:03

1 Answer


I retooled to something like this:

import math
import bpy

for image in bpy.data.images:
    # Skip images with no pixel data (e.g. missing source files).
    if image.size[0] == 0 or image.size[1] == 0:
        continue
    # Downsample to a quarter of the original resolution.
    nx = math.floor(0.25 * image.size[0])
    ny = math.floor(0.25 * image.size[1])
    print("scaling", image.name, "from", image.size[0], image.size[1], "to", nx, ny)
    image.scale(nx, ny)
    print("new size is:", image.size[0], image.size[1])
    image.pack()

But the question of which downsampling algorithm image.scale() applies still remains.
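If you need full control over the filter, one option is to bypass image.scale() entirely: read the flat RGBA buffer (in Blender that would be image.pixels[:]), downsample it yourself with numpy, and write it back. Below is a minimal sketch of a box-filter (area-average) downsample on such a buffer; box_downsample is a hypothetical helper name, and it assumes the width and height are divisible by the scale factor.

```python
import numpy as np

def box_downsample(pixels, width, height, factor=4, channels=4):
    """Area-average (box filter) downsample of a flat RGBA float
    buffer laid out like Blender's image.pixels.
    Assumes width and height are divisible by `factor`."""
    arr = np.asarray(pixels, dtype=np.float32).reshape(height, width, channels)
    nh, nw = height // factor, width // factor
    # Average each factor-by-factor block into one output pixel.
    small = arr.reshape(nh, factor, nw, factor, channels).mean(axis=(1, 3))
    return small.reshape(-1), nw, nh
```

To apply it inside Blender you would (untested assumption) call image.scale(nw, nh) first so the buffer sizes match, then assign the downsampled array to image.pixels. The box filter is only one choice; swapping the mean for another reduction gives different filtering behavior.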
