This is from a proposal the EU Commission published today (press release).
Among other things, the commission wants to extend the definition of CSAM to include not just realistic images, but also “reproductions or representations of a child engaged in sexually explicit conduct or realistic images of the sexual organs of a child, for primarily sexual purposes”.
Research has shown that limiting the dissemination of child sexual abuse material is not only crucial to avoid the re-victimisation linked to the circulation of images and videos of the abuse but is also essential as a form of offender-side prevention, as accessing child sexual abuse material is often the first step towards hands-on abuse, regardless of whether it depicts real or simply realistic abuse and exploitation. With the ongoing development of artificial intelligence applications capable of creating realistic images that are indistinguishable from real images, the number of so-called ‘deep-fake’ images and videos depicting child sexual abuse is expected to grow exponentially in the coming years. In addition, the development of augmented, extended and virtual reality settings making use of avatars, including sensory feedback, e.g. through devices providing a perception of touch, is not fully covered by the existing definition. The inclusion of an explicit reference to ‘reproductions and representations’ should ensure that the definition of child sexual abuse material covers these and future technological developments in a sufficiently technology-neutral and hence future-proof way.
They don’t mention child dolls explicitly, but the way I see it, they would definitely fall under this definition.