One of the use cases is to create a mesh from an object in a manner
similar to how Apply Modifiers does it, and to have it in the bmain so it
can be referenced by other objects.
This use case went unnoticed in the previous API changes, so here is a
follow-up.
Summary of changes:
* bpy.data.meshes.new_from_object() behaves almost the same as before this
change. The difference is that it now ensures all referenced data-blocks
are original (for example, materials referenced by the mesh).
* object.to_mesh() now creates a free-standing Mesh data-block which lives
outside of any bmain. The object owns it, which guarantees the memory
never leaks. It is possible to explicitly free the memory by calling
object.to_mesh_clear().
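A minimal sketch of both entry points (the object name "Cube" is an
assumption for illustration, not part of the change itself):

```python
import bpy

ob = bpy.data.objects["Cube"]  # any object in the file would do

# A Mesh data-block that lives in bmain and can be referenced by other
# objects, with all referenced data-blocks (e.g. materials) made original.
mesh_in_main = bpy.data.meshes.new_from_object(ob)

# A free-standing Mesh owned by the object, outside of any bmain.
mesh_owned = ob.to_mesh()
print(len(mesh_owned.vertices))
ob.to_mesh_clear()  # explicitly free the owned mesh
```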
Reviewers: brecht
Reviewed By: brecht
Differential Revision: https://developer.blender.org/D4875
The main goal here is to make it obvious and predictable what is going
on.
Summary of changes:
- Access to the dependency graph is now only possible for a fully
evaluated graph. This is done via context.evaluated_depsgraph_get().
The call ensures both relations and data-blocks are updated.
This way we don't allow access to a known bad state of the graph, and
it also makes explicit that getting an updated dependency graph is not
cheap.
- Access to an evaluated ID is now possible via id.evaluated_get().
It was already possible to get an evaluated ID via the dependency
graph, but it was a bit confusing that access to the original is done
via the ID while access to the evaluated one goes via the depsgraph.
If a data-block is not covered by the dependency graph, it is returned
as-is.
- Similarly, requesting the original from an ID which is not evaluated
returns the ID as-is.
- Removed scene.update().
Updating all the view layers this way is very expensive.
- Added depsgraph.update().
Now when temporary changes to objects are to be made, they happen on
the original object, after which the dependency graph is updated.
- Changed object.to_mesh() to behave the following way:
* When used for an original object, modifiers are ignored.
For meshes this acts similar to a mesh copy; not very useful on its
own, but it keeps code paths similar (i.e. for an exporter with an
Apply Modifiers option it is only a matter of choosing between the
original and the evaluated object, and the to_mesh() part can stay
the same).
For curves this gives a mesh constructed from the displist, without
taking the curve's own modifiers or the modifiers of bevel/taper
objects into account.
For metaballs this gives an empty mesh, since polygonization of a
metaball is not possible from a single object.
* When used for an evaluated object, modifiers are always applied.
In fact, no evaluation is happening: the mesh is either copied as-is,
or constructed from the current state of the curve cache.
The arguments to apply modifiers and to calculate original coordinates
(ORCO, aka undeformed coordinates) are removed. ORCO is to be calculated
as part of dependency graph evaluation. A minimal usage sketch of the new
workflow follows below.
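In the sketch, the object name and the modified property are placeholders,
not part of the patch:

```python
import bpy

depsgraph = bpy.context.evaluated_depsgraph_get()

# Temporary changes happen on the original object ...
ob_orig = bpy.data.objects["Cube"]
ob_orig.location.x += 1.0

# ... and the dependency graph is then updated explicitly.
depsgraph.update()

# Evaluated counterpart of the original data-block.
ob_eval = ob_orig.evaluated_get(depsgraph)

# On the evaluated object to_mesh() gives the mesh with modifiers applied;
# on the original object it would ignore them.
mesh_eval = ob_eval.to_mesh()
print(len(mesh_eval.vertices))
ob_eval.to_mesh_clear()

# Going back from an evaluated ID to its original.
assert ob_eval.original == ob_orig
```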
File used for regression testing (a Python script packed into a .blend):
{F7033464}
Patch to make the add-on tests pass:
{F7033466}
NOTE: I've included changes to the FBX exporter, which address report
T63689.
NOTE: All the enabled-by-default add-ons still need to be ported, but I
first want to have agreement on this part of the changes.
NOTE: The Python API documentation also needs work, but, again, that is
better done after there is agreement on this work.
Reviewers: brecht, campbellbarton, mont29
Differential Revision: https://developer.blender.org/D4834
Commit b6d7cdd3ce changed how the mesh data is deformed, which had not
yet been taken into account in this unit test.
Instead of directly reading the mesh vertices (which are no longer
animated), we convert the modified mesh to a new one and inspect those
vertices instead.
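A minimal sketch of that approach, assuming the object name and the use
of new_from_object() (the actual test code may differ):

```python
import bpy

depsgraph = bpy.context.evaluated_depsgraph_get()
ob_eval = bpy.data.objects["Cube"].evaluated_get(depsgraph)

# Convert the modified (evaluated) mesh to a new, independent mesh and
# inspect the deformed vertex coordinates on that copy.
mesh_new = bpy.data.meshes.new_from_object(ob_eval)
deformed_coords = [v.co.copy() for v in mesh_new.vertices]

bpy.data.meshes.remove(mesh_new)
```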
Houdini writes vertex data in a different format than Blender does;
Houdini uses "face-varying scope", which means that the vertex colours
are indexed by an ever-increasing number over all vertices of all faces,
instead of by the vertex index.
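A hypothetical Python illustration of the two indexing schemes (the
importer itself is C++, and this helper is made up for clarity):

```python
def colour_sample_index(loop, face_varying):
    """Return the colour sample index for a mesh loop (face corner).

    Vertex scope stores one sample per vertex, so the loop re-uses its
    vertex's sample. Houdini's face-varying scope stores one sample per
    face corner, counting up over all corners of all faces.
    """
    if face_varying:
        return loop.index        # ever-increasing per-corner index
    return loop.vertex_index     # shared per-vertex index
```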
I've also merged the read_custom_data_mcols() and read_mcols() functions,
because the latter was only called from the former, and the changes in this
commit would add yet more function parameters to pass.