Converters
The Converter interface defines a mapping between tagged objects in the ASDF tree and their corresponding Python object(s). Typically a Converter will map one YAML tag to one Python type, but the interface also supports many-to-one and many-to-many mappings. A Converter provides the software support for a tag and is responsible for both converting from parsed YAML to more complex Python objects and vice versa.
The Converter interface
Every Converter implementation must provide two required properties and two required methods:
Converter.tags - a list of tag URIs or URI patterns handled by the converter. Patterns may include the wildcard character *, which matches any sequence of characters up to a /, or **, which matches any sequence of characters. The uri_match method can be used to test URI patterns (a short example follows these interface descriptions).
Converter.types - a list of Python types or fully-qualified Python type names handled by the converter. For strings, either the private or the public import path can be used. For example, if the class Foo is implemented in example_package.foo.Foo but imported as example_package.Foo for convenience, either example_package.foo.Foo or example_package.Foo can be used. Since most libraries do not consider the module in which a class is implemented to be part of their public API, it is preferred to use the "public" location from which the class is imported (in this example, example_package.Foo). A string type name is recommended over a type object for performance reasons; see Entry point performance considerations.
Converter.to_yaml_tree - a method that accepts a complex Python object and returns a simple node object (typically a dict) suitable for serialization to YAML. The node is permitted to contain nested complex objects; these will in turn be passed to other to_yaml_tree methods in other Converters.
Converter.from_yaml_tree - a method that accepts a simple node object from parsed YAML and returns the appropriate complex Python object. For a non-lazy tree, nested nodes in the received node will have already been converted to complex objects by other calls to from_yaml_tree methods, except where reference cycles are present – see Reference cycles for information on how to handle that situation. For a lazy_tree (see asdf.open) the node will contain asdf.lazy_nodes instances, which act like dicts and lists but convert child objects only when they are accessed.
Additionally, the Converter interface includes a method that must be implemented when some logic is required to select the tag to assign to a to_yaml_tree result:

Converter.select_tag - an optional method that accepts a complex Python object and a list of candidate tags and returns the tag that should be used to serialize the object.
Converter.lazy - a boolean attribute indicating whether this converter accepts "lazy" objects (those defined in asdf.lazy_nodes). This is mostly useful for container-like classes, where the "lazy" objects can defer conversion of contained objects until they are accessed. If a converter produces a generator, lazy should be set to False, as asdf will need to generate nodes further out the branch to fully resolve the object returned from the generator.
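To illustrate the wildcard rules described for Converter.tags, URI patterns can be tested against concrete tag URIs. This is a minimal sketch assuming the helper referenced above is asdf.util.uri_match:

from asdf.util import uri_match

# "*" matches any sequence of characters within a single path segment ...
assert uri_match("asdf://example.com/tags/rectangle-*",
                 "asdf://example.com/tags/rectangle-1.0.0")
# ... while "**" also matches across "/" separators.
assert uri_match("asdf://example.com/**",
                 "asdf://example.com/tags/rectangle-1.0.0")
# A single "*" will not cross a "/".
assert not uri_match("asdf://example.com/*",
                     "asdf://example.com/tags/rectangle-1.0.0")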
A simple example
Say we have a Python class, Rectangle, that we wish to serialize to an ASDF file. A Rectangle instance has two attributes, width and height, and a convenient method that computes its area:
# in module example_package.shapes
class Rectangle:
    def __init__(self, width, height):
        self.width = width
        self.height = height

    def get_area(self):
        return self.width * self.height
We’ll need to designate a tag URI to represent this object’s type in the ASDF tree – let’s use asdf://example.com/shapes/tags/rectangle-1.0.0.
Here is a simple Converter implementation for this type and tag:
from asdf.extension import Converter


class RectangleConverter(Converter):
    tags = ["asdf://example.com/shapes/tags/rectangle-1.0.0"]
    types = ["example_package.shapes.Rectangle"]

    def to_yaml_tree(self, obj, tag, ctx):
        return {
            "width": obj.width,
            "height": obj.height,
        }

    def from_yaml_tree(self, node, tag, ctx):
        from example_package.shapes import Rectangle

        return Rectangle(node["width"], node["height"])
Note that the import of the Rectangle class has been deferred to inside the from_yaml_tree method. This is a performance consideration that is discussed in Entry point performance considerations.
In order to use this Converter, we’ll need to create a simple extension around it and install that extension:
import asdf
from asdf.extension import Extension


class ShapesExtension(Extension):
    extension_uri = "asdf://example.com/shapes/extensions/shapes-1.0.0"
    converters = [RectangleConverter()]
    tags = ["asdf://example.com/shapes/tags/rectangle-1.0.0"]


asdf.get_config().add_extension(ShapesExtension())
Now we can include a Rectangle object in an AsdfFile tree and write out a file:
with asdf.AsdfFile() as af:
    af["rect"] = Rectangle(5, 4)
    af.write_to("test.asdf")
The portion of the ASDF file that represents the rectangle looks like this:
rect: !<asdf://example.com/shapes/tags/rectangle-1.0.0> {height: 4, width: 5}
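With the extension installed, reading the file back converts the tagged node into a Rectangle instance again. A short sketch continuing the example above:

with asdf.open("test.asdf") as af:
    rect = af["rect"]
    print(type(rect).__name__, rect.get_area())  # Rectangle 20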
Deferring to another converter
Converters only support the exact types listed in Converter.types. When a supported type is subclassed, the extension will need to be updated to support the new subclass. There are a few options for supporting subclasses.

If serialization of the subclass needs to differ from the superclass, a new Converter, tag and schema should be defined.

If the subclass can be treated the same as the superclass (specifically, if subclass instances can be serialized as the superclass), then the subclass can be added to the existing Converter.types. Note that adding the subclass to the supported types (without making other changes to the Converter) will result in subclass instances using the same tag as the superclass. This means that any instances created during deserialization will always be of the superclass (subclass instances will never be read from an ASDF file).
Another option (useful when modifying the existing Converter is not convenient) is to define a Converter that does not tag the subclass instance being serialized and instead defers to the existing Converter. Deferral is triggered by returning None from Converter.select_tag and implementing Converter.to_yaml_tree to convert the subclass instance into an instance of the (supported) superclass.
For example, using the example Rectangle class above, let’s say we have another class, AspectRectangle, that represents a rectangle as a height and aspect ratio. We know we never need to deserialize this class for our uses and are ok with always reading Rectangle instances after saving AspectRectangle instances. In this case we can define a Converter for AspectRectangle that converts instances to Rectangle and defers to the RectangleConverter.
class AspectRectangle(Rectangle):
    def __init__(self, height, ratio):
        self.height = height
        self.ratio = ratio

    def get_area(self):
        width = self.height * self.ratio
        return width * self.height


class AspectRectangleConverter(Converter):
    tags = []
    types = [AspectRectangle]

    def select_tag(self, obj, tags, ctx):
        return None  # defer to a different Converter

    def to_yaml_tree(self, obj, tag, ctx):
        # convert the instance of AspectRectangle (obj) to
        # a supported type (Rectangle)
        return Rectangle(obj.height * obj.ratio, obj.height)

    def from_yaml_tree(self, node, tag, ctx):
        raise NotImplementedError()
Just like a non-deferring Converter this Converter will need to be added to an Extension and registered with asdf.
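As a sketch of how the deferral behaves (assuming AspectRectangleConverter and RectangleConverter are both included in a registered extension, as described above), writing an AspectRectangle stores it with the rectangle tag, so reading the file back yields a plain Rectangle:

with asdf.AsdfFile() as af:
    af["rect"] = AspectRectangle(4, 1.25)
    af.write_to("aspect.asdf")

with asdf.open("aspect.asdf") as af:
    # The subclass was serialized as a Rectangle, so that is what we read back.
    assert type(af["rect"]) is Rectangle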
Reference cycles
Special considerations must be made when deserializing a tagged object that contains a reference to itself among its descendants. Consider a fractions.Fraction subclass that maintains a reference to its multiplicative inverse:
# in the example_project.fractions module
import fractions


class FractionWithInverse(fractions.Fraction):
    def __init__(self, *args, **kwargs):
        self._inverse = None

    @property
    def inverse(self):
        return self._inverse

    @inverse.setter
    def inverse(self, value):
        self._inverse = value
Since the inverse of the inverse of a fraction is the fraction itself, we might wish to construct the objects in the following way:
f1 = FractionWithInverse(3, 5)
f2 = FractionWithInverse(5, 3)
f1.inverse = f2
f2.inverse = f1
This creates an “infinite loop” between the two fractions. An ordinary Converter wouldn’t be able to deserialize this, since each fraction requires that the other be deserialized first! Let’s see what happens when we define our from_yaml_tree method in a naive way:
class FractionWithInverseConverter(Converter):
    tags = ["asdf://example.com/fractions/tags/fraction-1.0.0"]
    types = ["example_project.fractions.FractionWithInverse"]

    def to_yaml_tree(self, obj, tag, ctx):
        return {
            "numerator": obj.numerator,
            "denominator": obj.denominator,
            "inverse": obj.inverse,
        }

    def from_yaml_tree(self, node, tag, ctx):
        from example_project.fractions import FractionWithInverse

        obj = FractionWithInverse(node["numerator"], node["denominator"])
        obj.inverse = node["inverse"]
        return obj
After adding this Converter to an Extension and installing it, the fraction will serialize correctly:
with asdf.AsdfFile({"fraction": f1}) as af:
af.write_to("with_inverse.asdf")
But upon deserialization, we notice a problem:
with asdf.open("with_inverse.asdf") as af:
reconstituted_f1 = af["fraction"]
assert reconstituted_f1.inverse.inverse is asdf.treeutil.PendingValue
The presence of PendingValue is asdf’s way of telling us that the value corresponding to the key inverse was not fully deserialized at the time that we retrieved it. We can handle this situation by making our from_yaml_tree a generator function:
def from_yaml_tree(self, node, tag, ctx):
    from example_project.fractions import FractionWithInverse

    obj = FractionWithInverse(node["numerator"], node["denominator"])
    yield obj
    obj.inverse = node["inverse"]
The generator version of from_yaml_tree yields the partially constructed FractionWithInverse object before setting its inverse property. This allows asdf to proceed to constructing the inverse FractionWithInverse object, and resume the original from_yaml_tree execution only when the inverse is actually available.
With this modification we can successfully deserialize our ASDF file:
with asdf.open("with_inverse.asdf") as af:
reconstituted_f1 = ff["fraction"]
assert reconstituted_f1.inverse.inverse is reconstituted_f1
Block storage
As described above, Converters can return complex objects that will be passed to other Converters. If a Converter returns an ndarray, asdf will recognize the array and store it in an ASDF block. This is the easiest and preferred means of storing data in ASDF blocks.
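As a minimal sketch of this approach (the Image class, tag URI and module names here are hypothetical, not part of asdf), a converter for an array-holding object can simply place the ndarray in the returned node and let asdf store it in a block:

import numpy as np
from asdf.extension import Converter


class ImageConverter(Converter):
    tags = ["asdf://example.com/example-project/tags/image-1.0.0"]
    types = ["example_package.image.Image"]

    def to_yaml_tree(self, obj, tag, ctx):
        # asdf recognizes the ndarray value and stores it in an ASDF block;
        # no explicit block management is needed here.
        return {"name": obj.name, "data": np.asarray(obj.data)}

    def from_yaml_tree(self, node, tag, ctx):
        from example_package.image import Image

        # node["data"] is provided by asdf's built-in ndarray support.
        return Image(node["name"], node["data"])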
For applications that require more flexibility, Converters can control block storage through use of the asdf.extension.SerializationContext provided as an argument to Converter.to_yaml_tree, Converter.from_yaml_tree and Converter.select_tag.
It is helpful to first review some details of how asdf stores blocks. Blocks are stored sequentially within an ASDF file, following the YAML tree. During reads and writes, asdf needs to know the index of the block a Converter would like to use so that the correct block can be read or written. However, the index used for reading might not be the same index for writing if the tree was modified or the file is being written to a new location. During serialization and deserialization, asdf will associate each object with the block accessed during Converter.from_yaml_tree and Converter.to_yaml_tree.
Note
Converters using multiple blocks are slightly more complicated. See: Converters using multiple blocks
A simple example of a Converter using block storage to store the payload for BlockData object instances is as follows:
import asdf
import numpy as np
from asdf.extension import Converter, Extension


class BlockData:
    def __init__(self, payload):
        self.payload = payload


class BlockConverter(Converter):
    tags = ["asdf://somewhere.org/tags/block_data-1.0.0"]
    types = [BlockData]

    def to_yaml_tree(self, obj, tag, ctx):
        block_index = ctx.find_available_block_index(
            lambda: np.ndarray(len(obj.payload), dtype="uint8", buffer=obj.payload),
        )
        return {"block_index": block_index}

    def from_yaml_tree(self, node, tag, ctx):
        block_index = node["block_index"]
        data_callback = ctx.get_block_data_callback(block_index)
        obj = BlockData(data_callback())
        return obj


class BlockExtension(Extension):
    tags = ["asdf://somewhere.org/tags/block_data-1.0.0"]
    converters = [BlockConverter()]
    extension_uri = "asdf://somewhere.org/extensions/block_data-1.0.0"


with asdf.config_context() as cfg:
    cfg.add_extension(BlockExtension())
    ff = asdf.AsdfFile({"example": BlockData(b"abcdefg")})
    ff.write_to("block_converter_example.asdf")
block_converter_example.asdf
#ASDF 1.0.0
#ASDF_STANDARD 1.5.0
%YAML 1.1
%TAG ! tag:stsci.edu:asdf/
--- !core/asdf-1.1.0
asdf_library: !core/software-1.0.0 {author: The ASDF Developers, homepage: 'http://github.com/asdf-format/asdf',
  name: asdf, version: 3.4.1.dev0+g01e9a52a.d20240804}
history:
  extensions:
  - !core/extension_metadata-1.0.0
    extension_class: asdf.extension._manifest.ManifestExtension
    extension_uri: asdf://asdf-format.org/core/extensions/core-1.5.0
    manifest_software: !core/software-1.0.0 {name: asdf_standard, version: 1.1.1}
    software: !core/software-1.0.0 {name: asdf, version: 3.4.1.dev0+g01e9a52a.d20240804}
  - !core/extension_metadata-1.0.0 {extension_class: builtins.BlockExtension, extension_uri: 'asdf://somewhere.org/extensions/block_data-1.0.0'}
example: !<asdf://somewhere.org/tags/block_data-1.0.0> {block_index: 0}
...
BLOCK 0:
allocated_size: 7
used_size: 7
data_size: 7
data: b'61626364656667'
#ASDF BLOCK INDEX
%YAML 1.1
---
- 842
...
During read, Converter.from_yaml_tree will be called. Within this method the Converter can prepare to access a block by calling SerializationContext.get_block_data_callback. This will return a function that, when called, will return the contents of the block (this supports lazy loading without keeping a reference to the SerializationContext, which is meant to be a short-lived and lightweight object).
During write, Converter.to_yaml_tree will be called. The Converter can use SerializationContext.find_available_block_index to find the location of an available block for writing. The data to be written to the block can be provided as an ndarray or as a callable function that will return an ndarray (note that it is possible this callable function will be called multiple times, so the developer should cache results from any non-repeatable sources).
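If the data comes from a non-repeatable source (for example, a stream that can only be read once), one approach is to wrap it in a small caching callable before handing it to find_available_block_index. This helper is hypothetical, not part of asdf:

import numpy as np


class CachedArrayCallback:
    """Hypothetical helper: call the underlying factory once and reuse the
    resulting ndarray if asdf invokes the callback again."""

    def __init__(self, make_array):
        self._make_array = make_array
        self._array = None

    def __call__(self):
        if self._array is None:
            self._array = self._make_array()
        return self._array


# Inside Converter.to_yaml_tree one might then write:
# block_index = ctx.find_available_block_index(
#     CachedArrayCallback(lambda: np.frombuffer(obj.payload, dtype="uint8"))
# )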
Converters using multiple blocks
As discussed above, while serializing and deserializing objects that use one block, asdf will watch which block is accessed by find_available_block_index and get_block_data_callback and associate the block with the converted object. This association allows asdf to map read and write blocks during updates of ASDF files. An object that uses multiple blocks must provide a unique key for each block it uses. These keys are generated using SerializationContext.generate_block_key and must be stored by the extension code. These keys must be resupplied to the converter when writing an object that was read from an ASDF file.
import asdf
import numpy as np
from asdf.extension import Converter, Extension


class MultiBlockData:
    def __init__(self, data):
        self.data = data
        self.keys = []


class MultiBlockConverter(Converter):
    tags = ["asdf://somewhere.org/tags/multi_block_data-1.0.0"]
    types = [MultiBlockData]

    def to_yaml_tree(self, obj, tag, ctx):
        if not len(obj.keys):
            obj.keys = [ctx.generate_block_key() for _ in obj.data]
        indices = [ctx.find_available_block_index(d, k) for d, k in zip(obj.data, obj.keys)]
        return {
            "indices": indices,
        }

    def from_yaml_tree(self, node, tag, ctx):
        indices = node["indices"]
        keys = [ctx.generate_block_key() for _ in indices]
        cbs = [ctx.get_block_data_callback(i, k) for i, k in zip(indices, keys)]
        obj = MultiBlockData([cb() for cb in cbs])
        obj.keys = keys
        return obj


class MultiBlockExtension(Extension):
    tags = ["asdf://somewhere.org/tags/multi_block_data-1.0.0"]
    converters = [MultiBlockConverter()]
    extension_uri = "asdf://somewhere.org/extensions/multi_block_data-1.0.0"


with asdf.config_context() as cfg:
    cfg.add_extension(MultiBlockExtension())
    obj = MultiBlockData([np.arange(3, dtype="uint8") + i for i in range(3)])
    ff = asdf.AsdfFile({"example": obj})
    ff.write_to("multi_block_converter_example.asdf")
multi_block_converter_example.asdf
#ASDF 1.0.0
#ASDF_STANDARD 1.5.0
%YAML 1.1
%TAG ! tag:stsci.edu:asdf/
--- !core/asdf-1.1.0
asdf_library: !core/software-1.0.0 {author: The ASDF Developers, homepage: 'http://github.com/asdf-format/asdf',
  name: asdf, version: 3.4.1.dev0+g01e9a52a.d20240804}
history:
  extensions:
  - !core/extension_metadata-1.0.0
    extension_class: asdf.extension._manifest.ManifestExtension
    extension_uri: asdf://asdf-format.org/core/extensions/core-1.5.0
    manifest_software: !core/software-1.0.0 {name: asdf_standard, version: 1.1.1}
    software: !core/software-1.0.0 {name: asdf, version: 3.4.1.dev0+g01e9a52a.d20240804}
  - !core/extension_metadata-1.0.0 {extension_class: builtins.MultiBlockExtension,
    extension_uri: 'asdf://somewhere.org/extensions/multi_block_data-1.0.0'}
example: !<asdf://somewhere.org/tags/multi_block_data-1.0.0>
  indices: [0, 1, 2]
...
BLOCK 0:
allocated_size: 3
used_size: 3
data_size: 3
data: b'000102'
BLOCK 1:
allocated_size: 3
used_size: 3
data_size: 3
data: b'010203'
BLOCK 2:
allocated_size: 3
used_size: 3
data_size: 3
data: b'020304'
#ASDF BLOCK INDEX
%YAML 1.1
---
- 867
- 924
- 981
...
Entry point performance considerations
For the good of asdf users everywhere, it’s important that entry point methods load as quickly as possible. All extensions must be loaded before reading an ASDF file, and therefore all converters are created as well. Any converter module or __init__ method that lingers will introduce a delay to the initial call to asdf.open. For that reason, we recommend that converter authors minimize the number of imports that occur in the module containing the Converter implementation, and defer imports of serializable types to within the from_yaml_tree method. This will prevent the type from ever being imported when reading ASDF files that do not contain the associated tag.