Object Mapping
Rich mapping support is provided by the MappingMongoConverter. The converter holds a metadata model that provides a full feature set to map domain objects to MongoDB documents. The mapping metadata model is populated by using annotations on your domain objects. However, the infrastructure is not limited to using annotations as the only source of metadata information. The MappingMongoConverter also lets you map objects to documents without providing any additional metadata, by following a set of conventions.
This section describes the features of the MappingMongoConverter, including fundamentals, how to use conventions for mapping objects to documents, and how to override those conventions with annotation-based mapping metadata.
Object Mapping Fundamentals
This section covers the fundamentals of Spring Data object mapping, object creation, field and property access, mutability and immutability. Note that this section only applies to Spring Data modules that do not use the object mapping of the underlying data store (like JPA). Also be sure to consult the store-specific sections for store-specific object mapping, like indexes, customizing column or field names, or the like.
Core responsibility of the Spring Data object mapping is to create instances of domain objects and map the store-native data structures onto those. This means we need two fundamental steps:
- Instance creation by using one of the exposed constructors.
- Instance population to materialize all exposed properties.
Object creation
Spring Data automatically tries to detect a persistent entity’s constructor to be used to materialize objects of that type. The resolution algorithm works as follows:
- If there is a single static factory method annotated with @PersistenceCreator, it is used.
- If there is a single constructor, it is used.
- If there are multiple constructors and exactly one is annotated with @PersistenceCreator, it is used (see the example below).
- If the type is a Java Record, the canonical constructor is used.
- If there is a no-argument constructor, it is used. Other constructors are ignored.
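For illustration, a minimal sketch (the fields shown are only illustrative): annotating exactly one of several constructors with @PersistenceCreator selects that constructor for object creation.

class Person {

    private final String firstname, lastname;

    Person(String firstname) {
        this(firstname, null);
    }

    @PersistenceCreator // exactly one of the two constructors is annotated, so this one is used
    Person(String firstname, String lastname) {
        this.firstname = firstname;
        this.lastname = lastname;
    }
}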
The value resolution assumes constructor/factory method argument names to match the property names of the entity, i.e. the resolution is performed as if the property were to be populated, including all customizations in mapping (different datastore column or field name, etc.).
This also requires parameter name information to be available in the class file or an @ConstructorProperties annotation to be present on the constructor.
The value resolution can be customized by using Spring Framework's @Value annotation with a store-specific SpEL expression.
Please consult the section on store-specific mappings for further details.
To avoid the overhead of reflection, Spring Data object creation uses a factory class generated at runtime by default, which calls the domain class's constructor directly. For example, for this type:
class Person {
Person(String firstname, String lastname) { … }
}
we will create a factory class semantically equivalent to this one at runtime:
class PersonObjectInstantiator implements ObjectInstantiator {
Object newInstance(Object... args) {
return new Person((String) args[0], (String) args[1]);
}
}
This gives us roughly a 10% performance boost over reflection. For the domain class to be eligible for such optimization, it needs to adhere to a set of constraints:
- it must not be a private class
- it must not be a non-static inner class
- it must not be a CGLib proxy class
- the constructor to be used by Spring Data must not be private
If any of these criteria match, Spring Data will fall back to entity instantiation via reflection.
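As a hypothetical sketch of these constraints, the first type below can use the generated instantiator while the nested type forces the reflection fallback:

public class Order {                        // qualifies: public top-level class with a public constructor
    public Order(String id) { /* ... */ }
}

public class Customer {
    class Address {                         // does not qualify: non-static inner class
        Address(String city) { /* ... */ }
    }
}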
Property population
Once an instance of the entity has been created, Spring Data populates all remaining persistent properties of that class. Unless already populated by the entity’s constructor (i.e. consumed through its constructor argument list), the identifier property will be populated first to allow the resolution of cyclic object references. After that, all non-transient properties that have not already been populated by the constructor are set on the entity instance. For that we use the following algorithm:
- If the property is immutable but exposes a with… method (see below), we use the with… method to create a new entity instance with the new property value.
- If property access (i.e. access through getters and setters) is defined, we invoke the setter method.
- If the property is mutable, we set the field directly.
- If the property is immutable, we use the constructor to be used by persistence operations (see Object creation) to create a copy of the instance.
- By default, we set the field value directly.
Similar to the optimizations used for object construction described above, we also use Spring Data runtime-generated accessor classes to interact with the entity instance.
class Person {
private final Long id;
private String firstname;
private @AccessType(Type.PROPERTY) String lastname;
Person() {
this.id = null;
}
Person(Long id, String firstname, String lastname) {
// Field assignments
}
Person withId(Long id) {
return new Person(id, this.firstname, this.lastname);
}
void setLastname(String lastname) {
this.lastname = lastname;
}
}
class PersonPropertyAccessor implements PersistentPropertyAccessor {
private static final MethodHandle firstname; 2
private Person person; 1
public void setProperty(PersistentProperty property, Object value) {
String name = property.getName();
if ("firstname".equals(name)) {
firstname.invoke(person, (String) value); 2
} else if ("id".equals(name)) {
this.person = person.withId((Long) value); 3
} else if ("lastname".equals(name)) {
this.person.setLastname((String) value); 4
}
}
}
1 | PropertyAccessors hold a mutable instance of the underlying object. This is to enable mutations of otherwise immutable properties. |
2 | By default, Spring Data uses field-access to read and write property values. As per visibility rules of private fields, MethodHandles are used to interact with fields. |
3 | The class exposes a withId(…) method that’s used to set the identifier, e.g. when an instance is inserted into the datastore and an identifier has been generated. Calling withId(…) creates a new Person object. All subsequent mutations will take place in the new instance leaving the previous untouched. |
4 | Using property-access allows direct method invocations without using MethodHandles . |
This gives us roughly a 25% performance boost over reflection. For the domain class to be eligible for such optimization, it needs to adhere to a set of constraints:
- Types must not reside in the default package or under the java package.
- Types and their constructors must be public.
- Types that are inner classes must be static.
- The used Java Runtime must allow for declaring classes in the originating ClassLoader. Java 9 and newer impose certain limitations.
By default, Spring Data attempts to use generated property accessors and falls back to reflection-based ones if a limitation is detected.
Let’s have a look at the following entity:
class Person {
private final @Id Long id; 1
private final String firstname, lastname; 2
private final LocalDate birthday;
private final int age; 3
private String comment; 4
private @AccessType(Type.PROPERTY) String remarks; 5
static Person of(String firstname, String lastname, LocalDate birthday) { 6
return new Person(null, firstname, lastname, birthday,
Period.between(birthday, LocalDate.now()).getYears());
}
Person(Long id, String firstname, String lastname, LocalDate birthday, int age) { 6
this.id = id;
this.firstname = firstname;
this.lastname = lastname;
this.birthday = birthday;
this.age = age;
}
Person withId(Long id) { 1
return new Person(id, this.firstname, this.lastname, this.birthday, this.age);
}
void setRemarks(String remarks) { 5
this.remarks = remarks;
}
}
1 | The identifier property is final but set to null in the constructor.
The class exposes a withId(…) method that’s used to set the identifier, e.g. when an instance is inserted into the datastore and an identifier has been generated.
The original Person instance stays unchanged as a new one is created.
The same pattern is usually applied for other properties that are store managed but might have to be changed for persistence operations.
The wither method is optional as the persistence constructor (see 6) is effectively a copy constructor and setting the property will be translated into creating a fresh instance with the new identifier value applied. |
2 | The firstname and lastname properties are ordinary immutable properties potentially exposed through getters. |
3 | The age property is an immutable but derived one from the birthday property.
With the design shown, the database value will trump the defaulting as Spring Data uses the only declared constructor.
Even if the intent is that the calculation should be preferred, it’s important that this constructor also takes age as parameter (to potentially ignore it) as otherwise the property population step will attempt to set the age field and fail due to it being immutable and no with… method being present. |
4 | The comment property is mutable and is populated by setting its field directly. |
5 | The remarks property is mutable and is populated by invoking the setter method. |
6 | The class exposes a factory method and a constructor for object creation.
The core idea here is to use factory methods instead of additional constructors to avoid the need for constructor disambiguation through @PersistenceCreator .
Instead, defaulting of properties is handled within the factory method.
If you want Spring Data to use the factory method for object instantiation, annotate it with @PersistenceCreator . |
General recommendations
- Try to stick to immutable objects — Immutable objects are straightforward to create as materializing an object is then a matter of calling its constructor only. Also, this keeps your domain objects from being littered with setter methods that allow client code to manipulate the object's state. If you need those, prefer to make them package protected so that they can only be invoked by a limited number of co-located types. Constructor-only materialization is up to 30% faster than properties population.
- Provide an all-args constructor — Even if you cannot or don't want to model your entities as immutable values, there's still value in providing a constructor that takes all properties of the entity as arguments, including the mutable ones, as this allows the object mapping to skip the property population for optimal performance.
- Use factory methods instead of overloaded constructors to avoid `@PersistenceCreator` — With an all-argument constructor needed for optimal performance, we usually want to expose more application use case specific constructors that omit things like auto-generated identifiers etc. It's an established pattern to rather use static factory methods to expose these variants of the all-args constructor.
- Make sure you adhere to the constraints that allow the generated instantiator and property accessor classes to be used.
- For identifiers to be generated, still use a final field in combination with an all-arguments persistence constructor (preferred) or a with… method.
- Use Lombok to avoid boilerplate code — As persistence operations usually require a constructor taking all arguments, their declaration becomes a tedious repetition of boilerplate parameter to field assignments that can best be avoided by using Lombok's @AllArgsConstructor (see the sketch after this list).
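A minimal sketch of such an entity using Lombok; the Customer type and its fields are made up for illustration:

@Getter
@AllArgsConstructor                 // all-args persistence constructor
public class Customer {

    @With                           // Lombok generates withId(…) for store-generated identifiers
    private final String id;

    private final String name;
}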
Overriding Properties
Java allows a flexible design of domain classes in which a subclass can define a property that is already declared with the same name in its superclass. Consider the following example:
public class SuperType {
private CharSequence field;
public SuperType(CharSequence field) {
this.field = field;
}
public CharSequence getField() {
return this.field;
}
public void setField(CharSequence field) {
this.field = field;
}
}
public class SubType extends SuperType {
private String field;
public SubType(String field) {
super(field);
this.field = field;
}
@Override
public String getField() {
return this.field;
}
public void setField(String field) {
this.field = field;
// optional
super.setField(field);
}
}
Both classes define a field
using assignable types. SubType
however shadows SuperType.field
.
Depending on the class design, using the constructor could be the only default approach to set SuperType.field
.
Alternatively, calling super.setField(…)
in the setter could set the field
in SuperType
.
All these mechanisms create conflicts to some degree because the properties share the same name yet might represent two distinct values.
Spring Data skips super-type properties if types are not assignable.
That is, the type of the overridden property must be assignable to its super-type property type to be registered as an override; otherwise, the super-type property is considered transient.
We generally recommend using distinct property names.
Spring Data modules generally support overridden properties holding different values. From a programming model perspective there are a few things to consider:
- Which property should be persisted (default to all declared properties)? You can exclude properties by annotating them with @Transient.
- How to represent properties in your data store? Using the same field/column name for different values typically leads to corrupt data, so you should annotate at least one of the properties with an explicit field/column name (see the sketch after this list).
- @AccessType(PROPERTY) cannot be used, as the super-property generally cannot be set without making further assumptions about the setter implementation.
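As a sketch, one way to keep both values apart in MongoDB is to give the subclass property an explicit field name; the name subField is made up for illustration:

public class SubType extends SuperType {

    @Field("subField")              // store the shadowing property under a distinct document field
    private String field;

    public SubType(String field) {
        super(field);
        this.field = field;
    }
}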
Kotlin support
Spring Data adapts specifics of Kotlin to allow object creation and mutation.
Kotlin object creation
Instantiation of Kotlin classes is supported. All classes are immutable by default and require explicit property declarations to define mutable properties.
Spring Data automatically tries to detect a persistent entity’s constructor to be used to materialize objects of that type. The resolution algorithm works as follows:
- If there is a constructor that is annotated with @PersistenceCreator, it is used.
- If the type is a Kotlin data class, the primary constructor is used.
- If there is a single static factory method annotated with @PersistenceCreator, it is used.
- If there is a single constructor, it is used.
- If there are multiple constructors and exactly one is annotated with @PersistenceCreator, it is used.
- If the type is a Java Record, the canonical constructor is used.
- If there is a no-argument constructor, it is used. Other constructors are ignored.
Consider the following data
class Person
:
data class Person(val id: String, val name: String)
The class above compiles to a typical class with an explicit constructor. We can customize this class by adding another constructor and annotating it with @PersistenceCreator to indicate a constructor preference:
data class Person(var id: String, val name: String) {
@PersistenceCreator
constructor(id: String) : this(id, "unknown")
}
Kotlin supports parameter optionality by allowing default values to be used if a parameter is not provided.
When Spring Data detects a constructor with parameter defaulting, it leaves these parameters absent if the data store does not provide a value (or simply returns null) so Kotlin can apply parameter defaulting. Consider the following class that applies parameter defaulting for name:
data class Person(var id: String, val name: String = "unknown")
Every time the name
parameter is either not part of the result or its value is null
, then the name
defaults to unknown
.
Property population of Kotlin data classes
In Kotlin, all classes are immutable by default and require explicit property declarations to define mutable properties.
Consider the following data
class Person
:
data class Person(val id: String, val name: String)
This class is effectively immutable.
It allows creating new instances as Kotlin generates a copy(…)
method that creates new object instances copying all property values from the existing object and applying property values provided as arguments to the method.
Kotlin Overriding Properties
Kotlin allows declaring property overrides to alter properties in subclasses.
open class SuperType(open var field: Int)
class SubType(override var field: Int = 1) :
SuperType(field) {
}
Such an arrangement renders two properties with the name field.
Kotlin generates property accessors (getters and setters) for each property in each class.
Effectively, the code looks as follows:
public class SuperType {
private int field;
public SuperType(int field) {
this.field = field;
}
public int getField() {
return this.field;
}
public void setField(int field) {
this.field = field;
}
}
public final class SubType extends SuperType {
private int field;
public SubType(int field) {
super(field);
this.field = field;
}
public int getField() {
return this.field;
}
public void setField(int field) {
this.field = field;
}
}
Getters and setters on SubType
set only SubType.field
and not SuperType.field
.
In such an arrangement, using the constructor is the only default approach to set SuperType.field
.
Adding a method to SubType
to set SuperType.field
via this.SuperType.field = …
is possible but falls outside of supported conventions.
Property overrides create conflicts to some degree because the properties share the same name yet might represent two distinct values.
We generally recommend using distinct property names.
Spring Data modules generally support overridden properties holding different values. From a programming model perspective there are a few things to consider:
- Which property should be persisted (default to all declared properties)? You can exclude properties by annotating them with @Transient.
- How to represent properties in your data store? Using the same field/column name for different values typically leads to corrupt data, so you should annotate at least one of the properties with an explicit field/column name.
- @AccessType(PROPERTY) cannot be used, as the super-property cannot be set.
Kotlin Value Classes
Kotlin Value Classes are designed for a more expressive domain model to make underlying concepts explicit. Spring Data can read and write types that define properties using Value Classes.
Consider the following domain model:
@JvmInline
value class EmailAddress(val theAddress: String) 1
data class Contact(val id: String, val name:String, val emailAddress: EmailAddress) 2
1 | A simple value class with a non-nullable value type. |
2 | Data class defining a property using the EmailAddress value class. |
Non-nullable properties using non-primitive value types are flattened in the compiled class to the value type. Nullable primitive value types or nullable value-in-value types are represented with their wrapper type, and that affects how value types are represented in the database.
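For illustration, assuming a Contact persisted with EmailAddress("dave@example.com"), the non-nullable value class is written flattened to its underlying String value, so the stored document looks roughly like:

{
  "_id" : "c-1",
  "name" : "Dave",
  "emailAddress" : "dave@example.com"
}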
Convention-based Mapping
MappingMongoConverter
has a few conventions for mapping objects to documents when no additional mapping metadata is provided.
The conventions are:
- The short Java class name is mapped to the collection name in the following manner: the class com.bigbank.SavingsAccount maps to the savingsAccount collection name (see the sketch after this list).
- All nested objects are stored as nested objects in the document and not as DBRefs.
- The converter uses any Spring Converters registered with it to override the default mapping of object properties to document fields and values.
- The fields of an object are used to convert to and from fields in the document. Public JavaBean properties are not used.
- If you have a single non-zero-argument constructor whose constructor argument names match top-level field names of the document, that constructor is used. Otherwise, the zero-argument constructor is used. If there is more than one non-zero-argument constructor, an exception is thrown.
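A minimal sketch of these conventions; the Address type and the fields are made up for illustration:

package com.bigbank;

public class SavingsAccount {       // persisted to the 'savingsAccount' collection

    private String accountNumber;

    private Address owner;          // stored as a nested document, not as a DBRef
}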
How the _id
field is handled in the mapping layer.
MongoDB requires that you have an _id field for all documents. If you do not provide one, the driver assigns an ObjectId with a generated value. The "_id" field can be of any type, other than arrays, so long as it is unique. The driver naturally supports all primitive types and Dates. When using the MappingMongoConverter there are certain rules that govern how properties from the Java class are mapped to this _id field.
The following outlines what field will be mapped to the _id
document field:
- A field annotated with @Id (org.springframework.data.annotation.Id) will be mapped to the _id field.
- A field without an annotation but named id will be mapped to the _id field.
- The default field name for identifiers is _id and can be customized via the @Field annotation.
The following outlines what type conversion, if any, will be done on the property mapped to the _id document field.
- If a field named id is declared as a String or BigInteger in the Java class, it will be converted to and stored as an ObjectId if possible. ObjectId as a field type is also valid. If you specify a value for id in your application, the conversion to an ObjectId is handled by the MongoDB driver. If the specified id value cannot be converted to an ObjectId, then the value is stored as-is in the document's _id field. This also applies if the field is annotated with @Id.
- If a field is annotated with @MongoId in the Java class, it will be converted to and stored using its actual type. No further conversion happens unless @MongoId declares a desired field type. If no value is provided for the id field, a new ObjectId is created and converted to the property's type.
- If a field is annotated with @MongoId(FieldType.…) in the Java class, an attempt is made to convert the value to the declared FieldType (see the sketch after this list). If no value is provided for the id field, a new ObjectId is created and converted to the declared type.
- If a field named id is not declared as a String, BigInteger, or ObjectId in the Java class, then you should assign it a value in your application so it can be stored as-is in the document's _id field.
- If no field named id is present in the Java class, then an implicit _id field will be generated by the driver but not mapped to a property or field of the Java class.
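A minimal sketch contrasting the default String id handling with an explicit @MongoId field type; the class names are made up for illustration:

public class PlainId {

    @Id
    private String id;                      // converted to and stored as an ObjectId when possible
}

public class StringId {

    @MongoId(FieldType.STRING)
    private String id;                      // always stored as a plain String
}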
When querying and updating, MongoTemplate uses the converter to handle conversions of the Query and Update objects that correspond to the above rules for saving documents, so the field names and types used in your queries can match what is in your domain classes.
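For example, a query can be expressed against the domain property names and the converter translates them into the mapped field names. A sketch, assuming a Person whose firstName property is mapped to the fName field as in the later example:

Query query = new Query(Criteria.where("firstName").is("Dave"));   // executed as { "fName" : "Dave" }
List<Person> people = mongoTemplate.find(query, Person.class);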
Data Mapping and Type Conversion
Spring Data MongoDB supports all types that can be represented as BSON, MongoDB’s internal document format. In addition to these types, Spring Data MongoDB provides a set of built-in converters to map additional types. You can provide your own converters to adjust type conversion. See Custom Conversions - Overriding Default Mapping for further details.
[Type conversion table: Type | Type conversion | Sample. Each supported Java type is mapped either natively or through a converter, together with a sample document representation. Java 8 date/time types use the UTC zone offset; configure via MongoConverterConfigurationAdapter.]
Collection Handling
Collection handling depends on the actual values returned by MongoDB.
Generally, if you use constructor creation, then you can get hold of the value to be set. Property population can make use of default initialization values if a property value is not being provided by a query response.
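As a sketch (the Order type is made up for illustration), a mutable collection property with a field default keeps that default when the query response contains no value for it:

class Order {

    private final String id;

    private List<String> lineItems = new ArrayList<>();    // retained if the response has no 'lineItems' value

    Order(String id) {
        this.id = id;
    }
}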
Mapping Configuration
Unless explicitly configured, an instance of MappingMongoConverter
is created by default when you create a MongoTemplate
.
You can create your own instance of the MappingMongoConverter
.
Doing so lets you dictate where in the classpath your domain classes can be found, so that Spring Data MongoDB can extract metadata and construct indexes.
Also, by creating your own instance, you can register Spring converters to map specific classes to and from the database.
You can configure the MappingMongoConverter
as well as com.mongodb.client.MongoClient
and MongoTemplate by using either Java-based or XML-based metadata.
The following example shows the configuration:
- Java
@Configuration
public class MongoConfig extends AbstractMongoClientConfiguration {
@Override
public String getDatabaseName() {
return "database";
}
// the following are optional
@Override
public String getMappingBasePackage() { 1
return "com.bigbank.domain";
}
@Override
protected void configureConverters(MongoConverterConfigurationAdapter adapter) { 2
adapter.registerConverter(new org.springframework.data.mongodb.test.PersonReadConverter());
adapter.registerConverter(new org.springframework.data.mongodb.test.PersonWriteConverter());
}
@Bean
public LoggingEventListener<MongoMappingEvent> mappingEventsListener() {
return new LoggingEventListener<MongoMappingEvent>();
}
}
1 | The mapping base package defines the root path used to scan for entities used to pre-initialize the MappingContext. By default the configuration class's package is used. |
2 | Configure additional custom converters for specific domain types that replace the default mapping procedure for those types with your custom implementation. |
- XML
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:mongo="http://www.springframework.org/schema/data/mongo"
       xsi:schemaLocation="
         http://www.springframework.org/schema/data/mongo
         https://www.springframework.org/schema/data/mongo/spring-mongo.xsd
         http://www.springframework.org/schema/beans
         https://www.springframework.org/schema/beans/spring-beans-3.0.xsd">

  <!-- Default bean name is 'mongo' -->
  <mongo:mongo-client host="localhost" port="27017"/>

  <mongo:db-factory dbname="database" mongo-ref="mongoClient"/>

  <!-- by default look for a Mongo object named 'mongo' - default name used for the converter is 'mappingConverter' -->
  <mongo:mapping-converter base-package="com.bigbank.domain">
    <mongo:custom-converters>
      <mongo:converter ref="readConverter"/>
      <mongo:converter>
        <bean class="org.springframework.data.mongodb.test.PersonWriteConverter"/>
      </mongo:converter>
    </mongo:custom-converters>
  </mongo:mapping-converter>

  <bean id="readConverter" class="org.springframework.data.mongodb.test.PersonReadConverter"/>

  <!-- set the mapping converter to be used by the MongoTemplate -->
  <bean id="mongoTemplate" class="org.springframework.data.mongodb.core.MongoTemplate">
    <constructor-arg name="mongoDbFactory" ref="mongoDbFactory"/>
    <constructor-arg name="mongoConverter" ref="mappingConverter"/>
  </bean>

  <bean class="org.springframework.data.mongodb.core.mapping.event.LoggingEventListener"/>

</beans>
AbstractMongoClientConfiguration
requires you to implement methods that define a com.mongodb.client.MongoClient
as well as provide a database name.
AbstractMongoClientConfiguration
also has a method named getMappingBasePackage(…)
that you can override to tell the converter where to scan for classes annotated with the @Document
annotation.
You can add additional converters to the converter by overriding the customConversionsConfiguration
method.
MongoDB’s native JSR-310 support can be enabled through MongoConverterConfigurationAdapter.useNativeDriverJavaTimeCodecs()
.
Also shown in the preceding example is a LoggingEventListener
, which logs MongoMappingEvent
instances that are posted onto Spring’s ApplicationContextEvent
infrastructure.
Java Time Types
We recommend using MongoDB's native JSR-310 support via MongoConverterConfigurationAdapter.useNativeDriverJavaTimeCodecs() as described above.
The base-package
property tells it where to scan for classes annotated with the @org.springframework.data.mongodb.core.mapping.Document
annotation.
If you want to rely on Spring Boot to bootstrap Data MongoDB, but still want to override certain aspects of the configuration, you may want to expose beans of that type.
For custom conversions, you may, for example, choose to register a custom conversions bean.
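A minimal sketch, assuming Spring Boot picks up a MongoCustomConversions bean and that the converter classes from the earlier example are on the classpath:

@Configuration
class CustomConversionsConfig {

    @Bean
    MongoCustomConversions mongoCustomConversions() {
        return new MongoCustomConversions(List.of(
                new org.springframework.data.mongodb.test.PersonReadConverter(),
                new org.springframework.data.mongodb.test.PersonWriteConverter()));
    }
}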
Metadata-based Mapping
To take full advantage of the object mapping functionality inside the Spring Data MongoDB support, you should annotate your mapped objects with the @Document
annotation.
Although it is not necessary for the mapping framework to have this annotation (your POJOs are mapped correctly, even without any annotations), it lets the classpath scanner find and pre-process your domain objects to extract the necessary metadata.
If you do not use this annotation, your application takes a slight performance hit the first time you store a domain object, because the mapping framework needs to build up its internal metadata model so that it knows about the properties of your domain object and how to persist them.
The following example shows a domain object:
package com.mycompany.domain;
@Document
public class Person {
@Id
private ObjectId id;
@Indexed
private Integer ssn;
private String firstName;
@Indexed
private String lastName;
}
The @Id
annotation tells the mapper which property you want to use for the MongoDB _id
property, and the @Indexed
annotation tells the mapping framework to call createIndex(…)
on that property of your document, making searches faster.
Automatic index creation is only done for types annotated with @Document
.
Auto index creation is disabled by default and needs to be enabled through the configuration (see Index Creation).
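A sketch of opting back in via Java configuration, assuming the autoIndexCreation() hook exposed by AbstractMongoClientConfiguration:

@Configuration
public class IndexConfig extends AbstractMongoClientConfiguration {

    @Override
    public String getDatabaseName() {
        return "database";
    }

    @Override
    protected boolean autoIndexCreation() {     // re-enable automatic index creation
        return true;
    }
}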
Mapping Annotation Overview
The MappingMongoConverter can use metadata to drive the mapping of objects to documents. The following annotations are available:
- @Id: Applied at the field level to mark the field used for identity purposes.
- @MongoId: Applied at the field level to mark the field used for identity purposes. Accepts an optional FieldType to customize id conversion.
- @Document: Applied at the class level to indicate this class is a candidate for mapping to the database. You can specify the name of the collection where the data will be stored.
- @DBRef: Applied at the field level to indicate it is to be stored using a com.mongodb.DBRef.
- @DocumentReference: Applied at the field level to indicate it is to be stored as a pointer to another document. This can be a single value (the id by default), or a Document provided via a converter.
- @Indexed: Applied at the field level to describe how to index the field.
- @CompoundIndex (repeatable): Applied at the type level to declare Compound Indexes.
- @GeoSpatialIndexed: Applied at the field level to describe how to geoindex the field.
- @TextIndexed: Applied at the field level to mark the field to be included in the text index.
- @HashIndexed: Applied at the field level for usage within a hashed index to partition data across a sharded cluster.
- @Language: Applied at the field level to set the language override property for text index.
- @Transient: By default, all fields are mapped to the document. This annotation excludes the field where it is applied from being stored in the database. Transient properties cannot be used within a persistence constructor as the converter cannot materialize a value for the constructor argument.
- @PersistenceConstructor: Marks a given constructor - even a package protected one - to use when instantiating the object from the database. Constructor arguments are mapped by name to the key values in the retrieved Document.
- @Value: This annotation is part of the Spring Framework. Within the mapping framework it can be applied to constructor arguments. This lets you use a Spring Expression Language statement to transform a key's value retrieved in the database before it is used to construct a domain object. In order to reference a property of a given document one has to use expressions like @Value("#root.myProperty") where root refers to the root of the given document.
- @Field: Applied at the field level, it allows describing the name and type of the field as it will be represented in the MongoDB BSON document, thus allowing the name and type to be different from the field name of the class as well as the property type.
- @Version: Applied at the field level, it is used for optimistic locking and checked for modification on save operations. The initial value is zero (one for primitive types), which is bumped automatically on every update.
The mapping metadata infrastructure is defined in a separate spring-data-commons project that is technology agnostic. Specific subclasses are used in the MongoDB support to support annotation-based metadata. Other strategies can also be put in place if there is demand.
@Document
@CompoundIndex(name = "age_idx", def = "{'lastName': 1, 'age': -1}")
public class Person<T extends Address> {
@Id
private String id;
@Indexed(unique = true)
private Integer ssn;
@Field("fName")
private String firstName;
@Indexed
private String lastName;
private Integer age;
@Transient
private Integer accountTotal;
@DBRef
private List<Account> accounts;
private T address;
public Person(Integer ssn) {
this.ssn = ssn;
}
@PersistenceConstructor
public Person(Integer ssn, String firstName, String lastName, Integer age, T address) {
this.ssn = ssn;
this.firstName = firstName;
this.lastName = lastName;
this.age = age;
this.address = address;
}
public String getId() {
return id;
}
// no setter for Id. (getter is only exposed for some unit testing)
public Integer getSsn() {
return ssn;
}
// other getters/setters omitted
}
When the native MongoDB type inferred by the mapping infrastructure does not match what you expect, you may even consider your own, custom annotation.
Special Field Names
Generally speaking MongoDB uses the dot (.
) character as a path separator for nested documents or arrays.
This means that in a query (or update statement) a key like a.b.c
targets an object structure as outlined below:
{
'a' : {
'b' : {
'c' : …
}
}
}
Therefore, up until MongoDB 5.0, field names must not contain dots (.).
Using MappingMongoConverter#setMapKeyDotReplacement allowed circumventing some of the limitations when storing Map structures by substituting dots on write with another character.
converter.setMapKeyDotReplacement("-");
// ...
source.map = Map.of("key.with.dot", "value")
converter.write(source,...) // -> map : { 'key-with-dot', 'value' }
With the release of MongoDB 5.0 this restriction on Document
field names containing special characters was lifted.
We highly recommend reading more about limitations on using dots in field names in the MongoDB Reference.
To allow dots in Map
structures please set preserveMapKeys
on the MappingMongoConverter
.
Using @Field allows customizing the field name to consider dots in two ways:
- @Field(name = "a.b"): The name is considered to be a path. Operations expect a structure of nested objects such as { a : { b : … } }.
- @Field(name = "a.b", fieldNameType = KEY): The name is considered as-is. Operations expect a field with the given name, as in { 'a.b' : … }.
Due to the special nature of the dot character in both MongoDB query and update statements field names containing dots cannot be targeted directly and therefore are excluded from being used in derived query methods.
Consider the following Item
having a categoryId
property that is mapped to the field named cat.id
.
public class Item {
@Field(name = "cat.id", fieldNameType = KEY)
String categoryId;
// ...
}
Its raw representation will look like
{
'cat.id' : "5b28b5e7-52c2",
...
}
Since we cannot target the cat.id
field directly (as this would be interpreted as a path) we need the help of the Aggregation Framework.
template.query(Item.class)
// $expr : { $eq : [ { $getField : { input : '$$CURRENT', 'cat.id' }, '5b28b5e7-52c2' ] }
.matching(expr(ComparisonOperators.valueOf(ObjectOperators.getValueOf("value")).equalToValue("5b28b5e7-52c2"))) 1
.all();
1 | The mapping layer takes care of translating the property name value into the actual field name.
It is absolutely valid to use the target field name here as well. |
template.update(Item.class)
.matching(where("id").is("r2d2"))
// $replaceWith: { $setField : { input: '$$CURRENT', field : 'cat.id', value : 'af29-f87f4e933f97' } }
.apply(AggregationUpdate.newUpdate(ReplaceWithOperation.replaceWithValue(ObjectOperators.setValueTo("value", "af29-f87f4e933f97")))) 1
.first();
1 | The mapping layer takes care of translating the property name value into the actual field name.
It is absolutely valid to use the target field name here as well. |
The above shows a simple example where the special field is present on the top document level. Increased levels of nesting increase the complexity of the aggregation expression required to interact with the field.
Customized Object Construction
The mapping subsystem allows the customization of the object construction by annotating a constructor with the @PersistenceConstructor
annotation.
The values to be used for the constructor parameters are resolved in the following way:
- If a parameter is annotated with the @Value annotation, the given expression is evaluated and the result is used as the parameter value.
- If the Java type has a property whose name matches the given field of the input document, then its property information is used to select the appropriate constructor parameter to pass the input field value to. This works only if the parameter name information is present in the Java .class files, which can be achieved by compiling the source with debug information or by using the -parameters command-line switch for javac in Java 8.
- Otherwise, a MappingException will be thrown indicating that the given constructor parameter could not be bound.
class OrderItem {
private @Id String id;
private int quantity;
private double unitPrice;
OrderItem(String id, @Value("#root.qty ?: 0") int quantity, double unitPrice) {
this.id = id;
this.quantity = quantity;
this.unitPrice = unitPrice;
}
// getters/setters omitted
}
Document input = new Document("id", "4711");
input.put("unitPrice", 2.5);
input.put("qty",5);
OrderItem item = converter.read(OrderItem.class, input);
The SpEL expression in the @Value annotation of the quantity parameter falls back to the value 0 if the qty attribute cannot be resolved for the given Document.
Additional examples for using the @PersistenceConstructor
annotation can be found in the MappingMongoConverterUnitTests test suite.
Mapping Framework Events
Events are fired throughout the lifecycle of the mapping process. This is described in the Lifecycle Events section.
Declaring these beans in your Spring ApplicationContext causes them to be invoked whenever the event is dispatched.
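A minimal sketch of such a bean; the listener name is made up for illustration:

@Component
public class BeforeConvertListener extends AbstractMongoEventListener<Person> {

    @Override
    public void onBeforeConvert(BeforeConvertEvent<Person> event) {
        Person person = event.getSource();      // inspect or adjust the object before it is converted
    }
}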